We found 729 jobs matching your search


Job Description

  • Should have 3 to 5 years of hands-on experience working with Java and Spring Framework components
  • Should have at least 2 years of hands-on experience using Java Spark on HDInsight or SoK8s
  • Should have at least 2 years of hands-on experience with container and orchestration tools such as Docker and Kubernetes
  • Should have experience on projects using Agile methodologies and CI/CD pipelines
  • Should have experience with at least one RDBMS such as Oracle, PostgreSQL, or SQL Server
  • Nice to have: exposure to Linux platforms such as RHEL and cloud platforms such as Azure Data Lake
  • Nice to have: exposure to the Investment Banking domain

  • Salary: Rs. 0.0 - Rs. 12.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Specialist Software Engineer - Java + BigData (250009V9)

Job Description

Java Full Stack Development

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java Full Stack Development

Job Description

Job Summary: We are seeking a skilled Data Engineer with 2–4 years of hands-on experience to join our team. The ideal candidate will design, build, and maintain data pipelines and data warehouse solutions, working with ETL processes on AWS. Proficiency in SQL and Python is essential to transform raw data into valuable business insights.

Key Responsibilities:
  • Develop, maintain, and optimize ETL pipelines to ingest, transform, and load data from various sources.
  • Design and implement data warehouse solutions to support reporting and analytics.
  • Work with AWS data services (such as S3, Redshift, Glue, EMR, etc.) for data storage and processing.
  • Write efficient, scalable SQL queries for data extraction, aggregation, and reporting.
  • Develop Python scripts for data transformation, automation, and integration tasks.
  • Monitor data pipelines, troubleshoot data issues, and ensure data quality and reliability.
  • Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
  • Document data flows, processes, and systems for transparency and maintainability.

Requirements:
  • 2–4 years of professional experience as a Data Engineer or in a similar data-focused engineering role.
  • Strong experience with ETL processes and tools.
  • Hands-on experience with data warehousing concepts and implementation.
  • Good knowledge of AWS cloud services (S3, Redshift, Glue, Lambda, etc.).
  • Proficiency in SQL (complex queries, optimization, data modeling).
  • Strong programming skills in Python (data manipulation, scripting).
  • Familiarity with version control systems such as Git.
  • Ability to work independently and collaboratively in a fast-paced environment.
  • Strong analytical and problem-solving skills.
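The ETL workflow this role describes (extract from a source, apply transformation and quality rules, load and aggregate for reporting) can be sketched in plain Python. This is an illustrative toy only: it uses an in-memory SQLite table as a stand-in for a warehouse such as Redshift, and every file, table, and column name is invented.

```python
# Minimal ETL sketch: extract order rows from CSV text, transform
# (cast types, drop incomplete rows), load into a SQLite "warehouse"
# table, and aggregate for reporting. All names are illustrative.
import csv
import io
import sqlite3

RAW_CSV = """order_id,region,amount
1,APAC,120.50
2,EMEA,
3,APAC,75.00
"""

def extract(text):
    """Extract: parse raw CSV into dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast amounts to float, skip rows with missing values."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # data-quality rule: drop incomplete rows
        clean.append((int(row["order_id"]), row["region"], float(row["amount"])))
    return clean

def load(rows, conn):
    """Load: insert cleaned rows, then aggregate revenue per region."""
    conn.execute("CREATE TABLE orders (order_id INT, region TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return dict(conn.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"))

conn = sqlite3.connect(":memory:")
totals = load(transform(extract(RAW_CSV)), conn)
print(totals)  # {'APAC': 195.5}
```

In a real pipeline the extract step would read from S3 and the load step would target the warehouse, but the shape of the three stages stays the same.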

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Engineer

Job Description

Job Title: DevSecOps Engineer with 4+ years' experience

Job Summary: We're looking for a dynamic DevSecOps Engineer to lead the charge in embedding security into our DevOps lifecycle. This role focuses on implementing secure, scalable, and observable cloud-native systems, leveraging Azure, Kubernetes, GitHub Actions, and security tools like Black Duck, SonarQube, and Snyk.

Key Responsibilities:
  • Architect, deploy, and manage secure Azure infrastructure using Terraform and Infrastructure as Code (IaC) principles
  • Build and maintain CI/CD pipelines in GitHub Actions, integrating tools such as Black Duck, SonarQube, and Snyk
  • Operate and optimize Azure Kubernetes Service (AKS) for containerized applications
  • Configure robust monitoring and observability stacks using Prometheus, Grafana, and Loki
  • Implement incident response automation with PagerDuty
  • Manage and support MS SQL databases and perform basic operations on Cosmos DB
  • Collaborate with development teams to promote security best practices across the SDLC
  • Identify vulnerabilities early and respond to emerging security threats proactively

Required Skills:
  • Deep knowledge of Azure services, AKS, and Terraform
  • Strong proficiency with Git, GitHub Actions, and CI/CD workflow design
  • Hands-on experience integrating and managing Black Duck, SonarQube, and Snyk
  • Proficiency in setting up monitoring stacks: Prometheus, Grafana, and Loki
  • Familiarity with PagerDuty for on-call and incident response workflows
  • Experience managing MS SQL and understanding Cosmos DB basics
  • Strong scripting ability (Python, Bash, or PowerShell)
  • Understanding of DevSecOps principles and secure coding practices
  • Familiarity with Helm, Bicep, container scanning, and runtime security solutions
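A common pattern behind "integrating tools such as Black Duck, SonarQube, and Snyk" into CI/CD is a small gate script that fails the build when a scanner report contains findings at or above a severity threshold. The sketch below shows that idea in Python; the JSON report shape and field names are invented for illustration and are not the actual export format of any of those tools.

```python
# CI security-gate sketch: surface the findings that should block a build.
# The report structure here is hypothetical; real Black Duck, Snyk, and
# SonarQube exports each have their own schemas.
import json

SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def gate(report_json, threshold="high"):
    """Return the findings at or above the severity threshold."""
    findings = json.loads(report_json).get("findings", [])
    floor = SEVERITY_RANK[threshold]
    return [f for f in findings
            if SEVERITY_RANK.get(f.get("severity", "low"), 0) >= floor]

# Illustrative scanner output: one critical, one medium finding.
report = json.dumps({"findings": [
    {"id": "CVE-2021-0001", "severity": "critical"},
    {"id": "CVE-2021-0002", "severity": "medium"},
]})

blocking = gate(report, threshold="high")
print([f["id"] for f in blocking])  # ['CVE-2021-0001']
```

In a GitHub Actions workflow a script like this would run as a step after the scan and exit non-zero when `blocking` is non-empty, which marks the job failed.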

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: DevSecOps Engineer

Job Description

Role: Data Engineer/ETL Developer - Talend/Power BI

Job Description:
  1. Study, analyze, and understand business requirements in the context of business intelligence, and provide end-to-end solutions.
  2. Design and implement ETL pipelines with data quality and integrity across platforms like Talend Enterprise and Informatica.
  3. Load data from heterogeneous sources such as Oracle, MS SQL, file systems, FTP services, REST APIs, etc.
  4. Design and map data models to turn raw data into meaningful insights, and build a data catalog.
  5. Develop strong data documentation covering algorithms, parameters, and models.
  6. Analyze past and present data for better decision making.
  7. Make essential technical changes to improve present business intelligence systems.
  8. Optimize ETL processes for improved performance; monitor ETL jobs and troubleshoot issues.
  9. Lead and oversee team deliverables; ensure best practices are followed for development.
  10. Participate in or lead requirements gathering and analysis.

Required Skillset and Experience:
  1. Overall up to 3 years of working experience, preferably in SQL and ETL (Talend).
  2. Must have 1+ years of experience in Talend Enterprise/Open Studio and related tools like Talend API, Talend Data Catalog, TMC, TAC, etc.
  3. Must have an understanding of database design and data modeling.
  4. Hands-on experience in a coding language (Java, Python, etc.).

Secondary Skillset/Good to have:
  1. Experience in a BI tool like MS Power BI.
  2. Ability to use Power BI to build interactive and visually appealing dashboards and reports.

Required Personal & Interpersonal Skills:
  • Strong analytical skills
  • Good communication skills, both written and verbal
  • Highly motivated and result-oriented
  • Self-driven, independent work ethic that drives internal and external accountability
  • Ability to interpret instructions for executives and technical resources
  • Advanced problem-solving skills dealing with complex distributed applications
  • Experience working in a multicultural environment
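Point 2 of this role pairs ETL with data quality and integrity. The idea can be sketched as row-level validation applied before load, independent of any specific tool; Talend or Informatica would express similar checks declaratively. The rules and field names below are illustrative.

```python
# Data-quality check sketch: validate each record against simple rules
# before loading, and separate clean rows from rejects. The rules and
# field names are hypothetical examples.
def validate(row):
    """Return a list of rule violations for one record (empty if clean)."""
    errors = []
    if not row.get("customer_id"):
        errors.append("missing customer_id")
    if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
        errors.append("amount must be a non-negative number")
    return errors

rows = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": -5},
]

good = [r for r in rows if not validate(r)]
bad = [(r, validate(r)) for r in rows if validate(r)]
print(len(good), len(bad))  # 1 1
```

Rejected rows would typically be written to an error table with their violation messages so they can be corrected and replayed, rather than silently dropped.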

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: ETL Developer

Job Description

Job Description: Kronos Workforce App for Time & Labor activities. This is a critical system used for real-time Work Order Labor (time) vouchering, and it prepares weekly and monthly hours for payroll. Essential Skills: experience with the Kronos Workforce App for Time & Labor activities described above.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Kronos Workforce Management Technical

Job Description

Job Description: Experience range 5-6 years; senior developer. Essential Skills: Apex, LWC, integration, and Sales Cloud/Service Cloud/Experience Cloud.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Salesforce Development and Technical Design

Job Description

C++ Developer

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: C++ Developer

Job Description

Role Description: Should coordinate among different teams in a data migration and cleansing project to find the right solution for the problems encountered on a day-to-day basis, i.e., troubleshoot migration issues. Escalate to the right solution team when there is a suspected issue with the solution. Guide the team, help onboard new people to the SAP team performing manual migrations, and propose automation where possible. Kindly refer to the details below and help with relevant profiles.

Requirements:
  1. IT experience of 6+ years.
  2. Should be able to handle PLM use-case and process creation and definition within one or more areas.
  3. Should help the team coordinate with business experts, understand the business process, and finalize documentation and understandings.
  4. Should have experience in Engineering Record (Web UI), DMS, BoM, and Material Master.
  5. Should be familiar with Teamcenter - SAP integration.
  6. Should have experience in data migration tools like LSMW and the ability to propose and learn industry-standard data migration tools.
  7. Should have experience in both Agile and Waterfall methodologies; work with the Product Owner on artifacts such as the Product Backlog, Sprint Backlog, etc.

Essential Skills:
  • IT experience of 8+ years.
  • Should be able to handle PLM use-case and process creation and definition within one or more areas.
  • Should help the team coordinate with business experts, understand the business process, and finalize documentation and understandings.
  • Should have experience in Engineering Record (Web UI), DMS, BoM, and Material Master.
  • Should be familiar with Teamcenter - SAP integration.
  • Should have experience in data migration tools like LSMW and the ability to propose and learn industry-standard data migration tools.
  • Should have experience in both Agile and Waterfall methodologies.
  • Work with the Product Owner on artifacts such as the Product Backlog, Sprint Backlog, etc.

Desirable Skills:
  • SAP technical knowledge.
  • Worked on data migration or cleansing projects.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP Supply Network Collaboration (SNC)