We found 1,160 jobs matching your search


Job Description

• Develop and optimize data pipelines using PySpark and SQL in Databricks to handle large-scale data transformations and batch/streaming workloads.
• Design and implement ETL processes by leveraging Databricks notebooks, Delta Lake, and Python-based custom scripts for data ingestion and curation.
• Collaborate with data analysts and stakeholders to understand data needs and deliver accurate, performant, and reusable data solutions.
• Maintain and monitor job performance, identify bottlenecks, and apply tuning techniques to improve pipeline efficiency on the Databricks platform.

Professional & Technical Skills:
• Proficient in building scalable, distributed data pipelines using PySpark and Databricks, with a deep understanding of DataFrame APIs and Spark optimization techniques.
• Experience in Python for scripting, data manipulation, and integrating third-party libraries within data engineering workflows.
• Expertise in SQL for data extraction, transformation, complex joins, and performance tuning across relational and distributed data stores.
• Experienced in working with Delta Lake, versioned data, schema evolution, and managing CDC (Change Data Capture) scenarios in a cloud-based environment.
• Familiar with best practices in CI/CD, code versioning (Git), and DevOps for data projects.

Additional Information:
• The candidate should have a minimum of 5 years of experience in Microsoft Azure Analytics Services.
• This position is based at our Hyderabad office.
• 15 years of full-time education is required.

Responsibilities

  • Salary: Rs. 0 - Rs. 1,50,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Application Developer
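The CDC (Change Data Capture) requirement in this listing comes down to applying a keyed change feed of inserts, updates, and deletes to a target table. A minimal pure-Python sketch of that merge logic follows; the function and field names are illustrative only, not from the posting:

```python
# Sketch of applying a CDC (Change Data Capture) feed to a keyed target
# table -- the same upsert/delete semantics that Delta Lake's MERGE INTO
# statement provides. All names here are illustrative.

def apply_cdc(target, changes, key="id"):
    """Apply insert/update/delete change events to target (a dict keyed by id)."""
    for event in changes:
        op, row = event["op"], event["row"]
        if op in ("insert", "update"):
            target[row[key]] = row          # upsert: last write wins
        elif op == "delete":
            target.pop(row[key], None)      # tolerate deletes of missing keys
    return target

table = {1: {"id": 1, "city": "Hyderabad"}}
feed = [
    {"op": "insert", "row": {"id": 2, "city": "Pune"}},
    {"op": "update", "row": {"id": 1, "city": "Chennai"}},
    {"op": "delete", "row": {"id": 2}},
]
result = apply_cdc(table, feed)
```

In Databricks the same semantics would normally be expressed declaratively with Delta Lake's `MERGE INTO` rather than hand-rolled in Python.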

Job Description

Python, Apache Spark, Apache Airflow

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cisco - Python, Apache Spark, Apache Airflow
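Apache Airflow, named in this listing, orchestrates tasks as a dependency DAG. A minimal stdlib sketch of what that ordering amounts to is below; the task names are made up, and a real Airflow DAG would use its `DAG` and operator API rather than `graphlib`:

```python
# Sketch of what an orchestrator like Apache Airflow does at its core:
# run tasks in an order that respects their dependencies.
# Task names are illustrative only.
from graphlib import TopologicalSorter

# task -> set of tasks it depends on (which must run first)
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

# static_order() yields every task after all of its dependencies
order = list(TopologicalSorter(dag).static_order())
```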

Job Description

Dynamics 365 CE Customer

Top mandatory skills:
• Strong experience with Dynamics 365 CE Customer Service, especially Omnichannel features.
• Strong experience with PCF (Power Apps Component Framework) control development.
• Proficiency in Power Platform: Power Apps, Power Automate, Power Pages.
• Good understanding of Azure services, particularly API authentication and integration.

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Dynamics 365 CE Customer

Job Description

BMCIMS-xwiki-Jira admin
• Provide application support for JIRA and Confluence to IT and the rest of the business.
• Understanding of JIRA APIs and webhooks.
• Support JIRA and Confluence users.
• Work with business users to evaluate JIRA and Confluence plug-ins, determine licensing needs, and perform testing, installation, and configuration of the plug-ins.
• Use design skills to define JIRA workflows and the related screen schemes for complex, high-impact projects.
• Implement custom XML-based JIRA gadgets.
• Support and configure JIRA plugins as required.
• Generate documentation on workflows and processes implemented in JIRA to support runbooks.
• Create JIRA projects, queries, and reports as required.

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: BMCIMS-xwiki-Jira admin
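The JIRA webhook work mentioned in this listing revolves around handling JSON event payloads. A minimal sketch of extracting the basics from one follows; the trimmed payload shape mirrors Atlassian's documented issue-event format, but treat the exact field names as an assumption to verify against your JIRA version:

```python
import json

# Trimmed example of a JIRA issue-event webhook payload. Real payloads
# carry many more fields; the keys used here ("webhookEvent", "issue",
# "key", "fields", "summary") follow Atlassian's issue-event format but
# should be verified against your JIRA version.
payload = json.loads("""
{
  "webhookEvent": "jira:issue_updated",
  "issue": {
    "key": "OPS-101",
    "fields": {"summary": "Renew Confluence plug-in licenses"}
  }
}
""")

event = payload["webhookEvent"]          # which event fired
issue_key = payload["issue"]["key"]      # the affected issue
summary = payload["issue"]["fields"]["summary"]
```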

Job Description

The candidate should have experience in designing, building, refactoring, and releasing software written in Java and Python, and strong experience with Big Data technologies including Hadoop, Spark, and Flink.

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Ebay - SRE & DevOps (Big Data)

Job Description

• At least 5 years of proven experience working with the ForgeRock platform.
• Extensive knowledge of AM, IDM, and DS.
• Strong understanding of IAM concepts, protocols, and standards (e.g., SAML, OAuth, …).
• Proficiency in programming and scripting languages (e.g., Java, JavaScript, Groovy).
• Excellent problem-solving and analytical skills.
• Strong communication and interpersonal skills.
• Ability to work independently as well as in a team environment.

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Proximus GCC | IAM Developer

Job Description

System Z - File-Aid

Job description: FileNet Developer.
• Experience in developing, troubleshooting, and implementing large and complex ECM solutions using IBM FileNet Content Manager and IBM BAW.
• Experience troubleshooting FileNet issues.
• Good understanding of FileNet architecture.
• Good knowledge of Oracle and SQL.
• Extensive knowledge of shell scripting.
• Case Manager/BAW development and customization experience.
• Experience with IBM Content Collector for SAP/Files.
• Experience working in Unix and Windows server environments.
• Experience with the ICN External Data Service (EDS) framework.
• Experience with the ICN plugin framework, including how to develop and deploy ICN plugins.
• Experience developing and deploying FileNet server-side extensions (subscriptions, events).
• Experience implementing Case Manager/BAW solutions and workflows.
• Experience with solution migration from IBM Case Manager to IBM BAW.
• Good knowledge of the FileNet API.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: System Z - File-Aid

Job Description

Mandatory skills: Windows OS, server. Skills to evaluate: Windows OS, Mac OS developer, server.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: C&S Infrastructure Security Engineer – Windows server OS and Mac OS

Job Description

BMCIMS-Env or Devops Manager

Responsibilities

  • Salary: Rs. 10,00,000 - Rs. 25,00,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: BMCIMS-Env or DevOps Manager