We found 1618 jobs matching your search

Job Description

• Analyze and understand business requirements related to plant maintenance.
• Design and configure SAP PM modules to align with organizational business processes.
• Implement and support end-to-end PM processes, including:
  o Preventive, calibration, refurbishment, and corrective maintenance
  o Work order and notification management
  o Maintenance task lists and maintenance planning
• Integrate SAP PM with cross-functional modules, including MM, SD, FI/CO, and PS.
• Manage master data setup, including equipment, functional locations, maintenance BOMs, and task lists.
• Provide comprehensive training and technical support to both end users and power users.
• Facilitate SAP rollouts by participating in testing, go-live execution, and post-implementation hypercare.
• Develop functional specifications (RICEFW) for custom reports and enhancements in collaboration with ABAP developers.
• Troubleshoot complex system issues and provide ongoing production support.
• Demonstrate expertise in SAP PM Fiori applications.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP PM

Job Description

As a Custom Software Engineer, you will engage in the development of custom software solutions designed to meet specific business needs. Your typical day will involve coding, enhancing components, and collaborating with various teams to ensure the delivery of scalable and high-performing solutions. You will utilize modern frameworks and agile practices to streamline processes and improve overall efficiency in software development.

Roles & Responsibilities:
• Expected to be an SME.
• Collaborate with and manage the team to perform.
• Responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for the immediate team and across multiple teams.
• Facilitate knowledge-sharing sessions to enhance team capabilities.
• Mentor junior team members to foster their professional growth.
• Conduct feasibility assessments to determine the most effective data scrambling approach (centralized vs. decentralized).

Professional & Technical Skills:
• Must-have skills: proficiency in IBM System i (AS/400) RPG IV.
• Strong understanding of software development life cycle methodologies.
• Experience with modern programming frameworks and agile practices, with proven experience in data analysis, testing, and technical project support, and an understanding of MIS environments and LPAR configurations.
• Ability to troubleshoot and resolve software issues efficiently, and to analyze data upload interfaces and downstream processes within the iSeries-based ODW to inform the scrambling strategy.
• Familiarity with database management and integration techniques, and with enterprise data warehouse environments and interface management on iSeries.
• Hands-on knowledge of data scrambling, masking, and privacy techniques, with developer-level proficiency in RPG ILE, CL ILE, and SQL (see the sketch after this description).
• Support the development of a new data scrambling tool tailored for ODW, building on the existing RPG ILE free-format and CL ILE tool.
• Ability to develop SQL scripts and scrambling logic for base tables and interfaces.
• Strong documentation and communication skills, including experience with Confluence and Draw.io.

Additional Information:
• The candidate should have a minimum of 5 years of experience in IBM System i (AS/400) RPG IV.
• This position is based at our Gurugram office.
• 15 years of full-time education is required.
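The scrambling work above is described in terms of RPG ILE, CL ILE, and SQL on IBM i. As a language-neutral illustration only (not the post's tooling), the sketch below shows the core idea of deterministic masking: the same input always yields the same scrambled value, so keys remain consistent across base tables and downstream interfaces. The HMAC approach, field names, and the SCRAMBLE_KEY variable are assumptions for illustration.

```typescript
// Illustrative sketch (not the RPG ILE/SQL tooling the post describes) of deterministic
// data scrambling: identical inputs always map to identical masked values, so scrambled
// keys stay consistent across base tables and downstream interfaces.
import { createHmac } from "node:crypto";

const SCRAMBLE_KEY = process.env.SCRAMBLE_KEY ?? "non-production-key"; // placeholder secret

// Deterministically pseudonymize a sensitive field, optionally keeping a short prefix.
function scramble(value: string, keepChars = 0): string {
  const digest = createHmac("sha256", SCRAMBLE_KEY).update(value).digest("hex");
  return value.slice(0, keepChars) + digest.slice(0, Math.max(value.length - keepChars, 8));
}

// Example: mask a customer record before loading it into a test copy of the warehouse.
const record = { customerId: "C0012345", name: "Jane Example", email: "jane@example.com" };
const masked = {
  customerId: scramble(record.customerId, 1), // keep the leading type character
  name: scramble(record.name),
  email: scramble(record.email) + "@masked.invalid",
};
console.log(masked);
```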

  • Salary: Rs. 0.0 - Rs. 2,16,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Custom Software Engineer

Job Description

As a Custom Software Engineer, you will engage in the design, construction, and configuration of applications tailored to meet specific business processes and application requirements. Your typical day will involve collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and debugging processes to deliver high-quality software solutions that align with organizational goals and user needs.

Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation in and contribution to team discussions is required.
• Contribute to providing solutions to work-related problems.
• Collaborate with stakeholders to gather and analyze requirements for application development.
• Participate in code reviews and provide constructive feedback to peers.

Professional & Technical Skills:
• Must-have skills: proficiency in the Pega Platform.
• Experience with application development methodologies and best practices.
• Strong understanding of business process management and workflow automation.
• Familiarity with integration techniques and tools for connecting applications.
• Ability to troubleshoot and resolve application issues effectively.
• Strong understanding of declarative rules such as Declare OnChange, Declare Trigger, and Declare Expression.
• Knowledge of CI/CD tools such as Jenkins and Pega Deployment Manager for the deployment process; manual deployment process knowledge is a must.
• Integration knowledge, especially REST and SOAP, both consuming and publishing, is a must.
• Good knowledge of case management.
• Good to have: knowledge of Kafka, Queue Processors, Job Schedulers, Data Flows, Data Sets, and batch and real-time processing concepts.
• Knowledge of data types and data tables is a must.
• Good to have: knowledge of Pega debugging tools such as PDC and Splunk.
• Good to have: Constellation knowledge.

Additional Information:
• The candidate should have a minimum of 5 years of experience on the Pega Platform.
• CSA and CSSA certifications are mandatory.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

  • Salary: Rs. 0.0 - Rs. 2,00,000.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Custom Software Engineer

Job Description

Level: SA. Immediate joiners only.
Skills: Cloudflare WAF, Cloudflare Advanced DDoS protection
• Primary mandatory skill: Cloudflare WAF
• Secondary mandatory skill: Security Operations
• Open to CWRs: Yes
• Flexible to hire in any location: Yes

Detailed Job Description

Cloudflare WAF Management
• Design, implement, and manage Cloudflare WAF policies across enterprise applications
• Tune managed rulesets, custom rules, and rate-limiting policies to minimize false positives
• Monitor, analyze, and respond to WAF security events and incidents
• Implement protection against OWASP Top 10 threats, DDoS attacks, bot abuse, and API threats
• Coordinate with application teams to onboard new applications to Cloudflare securely

CDN & Performance Optimization
• Manage Cloudflare CDN configurations for optimal performance and availability
• Configure caching strategies, page rules, and traffic routing policies
• Troubleshoot latency, caching, and origin connectivity issues
• Support global traffic management and high-availability architectures

Cloudflare Workers & Edge Logic
• Develop and maintain Cloudflare Workers for edge-based logic, request validation, and traffic manipulation
• Implement Workers for security use cases such as header validation, token checks, redirects, and API protection (see the sketch after this description)
• Collaborate with developers to deploy and manage Workers in CI/CD pipelines

Security Operations & Governance
• Integrate Cloudflare logs with SIEM/SOC tools for monitoring and alerting
• Perform regular security reviews, audits, and compliance checks
• Create documentation, runbooks, and operational procedures for WAF and edge security
• Stay current with Cloudflare features, threat intelligence, and industry best practices

Required Skills & Qualifications
• Experience in web security, network security, or cloud security
• Strong hands-on experience with Cloudflare WAF, CDN, and security products
• Solid understanding of HTTP/HTTPS, TLS, DNS, and web application architecture
• Experience managing WAF rules, rate limiting, bot management, and DDoS protection
• Working knowledge of Cloudflare Workers (JavaScript)
• Experience integrating security logs with tools such as Splunk and S3
• Familiarity with OWASP Top 10, API security, and Zero Trust concepts
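The Workers bullet above names header validation, token checks, and redirects as edge security use cases. Below is a minimal sketch of what such a Worker might look like, using Cloudflare's module Worker syntax; the header name, paths, and the EXPECTED_TOKEN secret binding are illustrative placeholders, not details from the post.

```typescript
// Minimal Cloudflare Worker sketch (module syntax) showing the edge logic the post lists:
// a redirect rule, a header/token check on API routes, and pass-through to the origin.
// Header name, paths, and the EXPECTED_TOKEN binding are illustrative placeholders.
export interface Env {
  EXPECTED_TOKEN: string;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);

    // Example redirect rule for a retired path.
    if (url.pathname === "/old-login") {
      return Response.redirect(`${url.origin}/login`, 301);
    }

    // Simple header validation on API routes: reject requests without the expected token.
    if (url.pathname.startsWith("/api/")) {
      const token = request.headers.get("X-Api-Token");
      if (token !== env.EXPECTED_TOKEN) {
        return new Response("Forbidden", { status: 403 });
      }
    }

    // Otherwise pass the request through to the origin unchanged.
    return fetch(request);
  },
};
```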

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cloudflare WAF

Job Description

Key Responsibilities
• Design, develop, and maintain scalable web applications using the MERN stack.
• Build reusable and efficient React components and implement responsive UI.
• Develop robust RESTful APIs using Node.js and Express.js (see the sketch after this description).
• Design and manage MongoDB databases, schemas, and queries.
• Integrate third-party APIs and services.
• Implement authentication, authorization, and security best practices (JWT/OAuth).
• Optimize applications for performance, scalability, and reliability.
• Write clean, maintainable, and testable code.
• Collaborate with cross-functional teams in an Agile/Scrum environment.
• Participate in code reviews, troubleshooting, and debugging.
• Deploy applications on cloud platforms and manage CI/CD pipelines.

Mandatory Skills
• Strong proficiency in JavaScript (ES6+)
• Expertise in React.js, Redux/Context API, and Hooks
• Strong backend development using Node.js and Express.js
• Hands-on experience with MongoDB and Mongoose
• Experience building RESTful APIs
• Knowledge of HTML5, CSS3, and Bootstrap/Tailwind
• Experience with Git version control
• Understanding of JWT, OAuth, authentication, and authorization
• Experience with Postman and API testing
• Familiarity with Docker and cloud platforms (AWS/Azure/GCP)
• Understanding of CI/CD pipelines
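As a rough illustration of the REST API and JWT items above, here is a minimal Express + jsonwebtoken sketch in TypeScript. The route names, token payload, and the JWT_SECRET environment variable are assumptions for illustration, not requirements from the post.

```typescript
// Minimal Express + JWT sketch: issue a token, then protect a REST resource with middleware.
import express, { Request, Response, NextFunction } from "express";
import jwt from "jsonwebtoken";

const app = express();
app.use(express.json());

const JWT_SECRET = process.env.JWT_SECRET ?? "dev-only-secret"; // placeholder

// Issue a token (a real app would first verify credentials against MongoDB).
app.post("/api/login", (req: Request, res: Response) => {
  const { username } = req.body;
  const token = jwt.sign({ sub: username }, JWT_SECRET, { expiresIn: "1h" });
  res.json({ token });
});

// Middleware that rejects requests without a valid Bearer token.
function requireAuth(req: Request, res: Response, next: NextFunction) {
  const header = req.headers.authorization ?? "";
  const token = header.startsWith("Bearer ") ? header.slice(7) : "";
  try {
    (req as any).user = jwt.verify(token, JWT_SECRET); // attach decoded claims for handlers
    next();
  } catch {
    res.status(401).json({ error: "invalid or missing token" });
  }
}

// A protected REST resource.
app.get("/api/profile", requireAuth, (req: Request, res: Response) => {
  res.json({ user: (req as any).user });
});

app.listen(3000, () => console.log("API listening on :3000"));
```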

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: MERN Stack Developer (Z2)

Job Description

Consultant Software Engineer - Kafka & NiFi Admin with Ansible & Terraform - (250008TU)

Missions

Position Overview: We are seeking an experienced Senior Kafka & NiFi Administrator to design, deploy, manage, and optimize enterprise-scale data streaming and data flow platforms. The ideal candidate will have deep expertise in Apache Kafka, Kafka Connect, Schema Registry, and Apache NiFi, along with strong automation, scripting, CI/CD, and cloud-native experience. This role ensures high availability, secure configurations, performance tuning, and operational excellence for real-time data pipelines.

Key Responsibilities:
1. Kafka Administration
   a. Install, configure, and manage Apache Kafka clusters (on-premise or cloud-native: Azure HDInsight).
   b. Manage Kafka ecosystem components: Kafka Connect, Schema Registry, Kafka Streams, ZooKeeper, etc.
   c. Perform cluster scaling, partition rebalancing, topic management, and multi-DC replication (MirrorMaker).
   d. Implement monitoring, alerting, and logging using tools such as Prometheus, Grafana, and ELK/ECE.
   e. Ensure Kafka security (TLS encryption, SASL, RBAC, Kerberos, OAuth).
   f. Troubleshoot broker issues, performance bottlenecks, consumer lag, and message serialization errors.
2. NiFi Administration
   a. Install, manage, and maintain Apache NiFi and NiFi Registry.
   b. Design, optimize, and troubleshoot complex NiFi data flows.
   c. Manage NiFi cluster configuration, back-pressure settings, tuning, and the provenance repository.
   d. Integrate NiFi with Kafka, S3, HDFS, RDBMS, REST APIs, and cloud services.
   e. Implement access control, SSL/TLS security, policies, and NiFi user/group management.
3. Automation & DevOps
   a. Develop automation using Ansible, Terraform, or Bash.
   b. Build CI/CD pipelines for Kafka, NiFi, and data flow deployments using Jenkins, Azure DevOps, or GitHub Actions.
   c. Automate cluster provisioning and configuration using Ansible and Terraform.
   d. Create reusable templates and automation for topic creation, ACL management, connector deployment, and flow lifecycle (see the sketch after this description).
4. Operations & Support
   a. Provide L3 support for streaming platforms, including incident analysis and root cause identification.
   b. Establish and enforce best practices for data governance, data flow reliability, and operational standards.
   c. Maintain detailed documentation for configurations, architectures, and runbooks.
   d. Collaborate with platform engineering, data engineering, security, SRE, and cloud teams.

Key skills required:
1. Must have:
   a. 6-10+ years of experience in Kafka & NiFi administration
   b. Strong knowledge of big data architecture and the administrator's role
   c. Strong knowledge of Hadoop, Kafka internals, NiFi flow design, performance tuning, various data formats, Schema Registry, Kafka Connect, etc.
   d. Configuration and performance tuning of Kafka & NiFi clusters
   e. Application deployment and disaster recovery
   f. Automation of big data infrastructure with Ansible & Terraform
2. Good to have:
   a. Java & shell scripting
   b. Hadoop administration
   c. Excellent communication skills
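The topic-creation automation in item 3d could be scripted in several ways; the post itself points to Ansible and Terraform. Purely as an illustration of the idea, here is a small TypeScript sketch using the kafkajs admin client to create a topic idempotently; the broker address, topic name, and partition/replication settings are placeholders.

```typescript
// Illustrative sketch of scripted Kafka topic creation using the kafkajs admin client.
// Broker addresses and topic settings are placeholders; production automation in this
// role would more likely live in Ansible/Terraform, as the post describes.
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "topic-automation", brokers: ["broker1:9092"] });
const admin = kafka.admin();

async function ensureTopic(topic: string, numPartitions: number, replicationFactor: number) {
  await admin.connect();
  try {
    // Only create the topic if it does not already exist (idempotent behavior).
    const existing = await admin.listTopics();
    if (!existing.includes(topic)) {
      await admin.createTopics({
        topics: [{ topic, numPartitions, replicationFactor }],
        waitForLeaders: true,
      });
    }
  } finally {
    await admin.disconnect();
  }
}

ensureTopic("orders.events", 6, 3).catch((err) => {
  console.error("topic creation failed", err);
  process.exit(1);
});
```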

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Consultant Software Engineer

Job Description

Lead Software Engineer - DevOps - (260006HM)

Missions
DevOps engineer.

Key Responsibilities
• CI/CD Pipeline Management: Design, implement, and maintain automated pipelines to accelerate development and software delivery.
• Collaboration and Automation: Work with development and IT teams to automate repetitive tasks and resolve production bottlenecks.
• Containerization & Security: Manage containers (e.g., Docker, Kubernetes/OpenShift) and enforce security best practices.
• Monitoring and Reliability: Implement monitoring solutions to ensure system uptime, performance, and stability (see the sketch after this description).

Profile
Required Skills and Qualifications
• Experience: Proven experience in a DevOps or Site Reliability Engineering (SRE) role.
• Technical Skills: Deep knowledge of CI/CD principles; proficiency with Linux/Unix scripting (Python, Bash), CI/CD tools (Jenkins, GitLab CI), and container orchestration (Kubernetes, OpenShift); knowledge of microservice architecture, the Spring framework, and mobile development languages.
• Tools: Experience with Git, Artifactory/Xray, SonarQube, and monitoring tools (Prometheus, Grafana).
• Soft Skills: Strong problem-solving, analytical, and communication skills to foster collaboration between teams.
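As a hedged illustration of the monitoring bullet above, the sketch below exposes a Prometheus-style /metrics endpoint using the prom-client and express npm packages and probes a dependency's health URL on a timer. The target URL, metric name, and port are assumptions for illustration, not details from the post.

```typescript
// Illustrative uptime-monitoring sketch: probe a dependency and expose the result
// as a Prometheus gauge that Grafana could chart or alert on.
import express, { Request, Response } from "express";
import client from "prom-client";

const register = new client.Registry();
client.collectDefaultMetrics({ register }); // CPU, memory, event-loop metrics

const upGauge = new client.Gauge({
  name: "dependency_up",
  help: "1 if the dependency health check succeeded, 0 otherwise",
  labelNames: ["target"],
  registers: [register],
});

const TARGETS = ["https://example.internal/health"]; // placeholder URL

async function probe(): Promise<void> {
  for (const target of TARGETS) {
    try {
      const res = await fetch(target, { signal: AbortSignal.timeout(5000) });
      upGauge.set({ target }, res.ok ? 1 : 0);
    } catch {
      upGauge.set({ target }, 0);
    }
  }
}
setInterval(probe, 30_000); // re-check every 30 seconds

const app = express();
app.get("/metrics", async (_req: Request, res: Response) => {
  res.set("Content-Type", register.contentType);
  res.send(await register.metrics());
});
app.listen(9100, () => console.log("metrics exposed on :9100"));
```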

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Lead Software Engineer - DevOps

Job Description

Specialist Software Engineer - DevOps Engineer - (260005C7)

Missions
Our Data-as-a-Service team is expanding, and we need three big data experts to join us. Today we manage the SG Data platform service offers and data tools:
• Data ingestion: ingest data from a data source
• Data processing: process, transform, and distribute data to the target location
• Data exposition: distribute and expose the data hosted in the data platform
• Data cleaning: purge data on the Lake in a secure way
• Data tools: data formatter, data anonymizer

As a big data engineer, you will develop, maintain, and create new cloud-oriented features hand in hand with the DDS/Data Foundation teams to bring forward the data approach across all SG entities and subsidiaries.

Profile
Experience: 3-5 years
Skills:
• Computer programming: Spark, Scala, Java, Python
• Operating system knowledge for Unix and Linux
• Databases and SQL
• ETL and data warehousing
• Hadoop
• Apache Spark

Tasks:
• Maintain an excellent service level of the platform and avoid issues.
• Research new methods of obtaining valuable data and improving its quality.
• Create and enhance data solutions using various programming languages and tools.
• Create and enhance data architectures that meet the requirements of the business.
• Communicate within the tech league group to find new approaches.

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Specialist Software Engineer - DevOps Engineer

Job Description

Consultant Software Engineer - Big Data Azure Administrator with Java - (2600057T)

Missions
• The usability of the implemented presentations
• Application package expertise
• The overall production schedule
• Automation based on Cloudera CDP 7.1, built on Ansible and Terraform, in a hybrid context of internal cloud and physical servers
• Maintenance of production: resolution of customer tickets, infrastructure incidents, management of new requests from our beneficiaries, and management of task forces presenting complex technical issues
• Delivery of features that bring value to our beneficiaries, involving infrastructure and/or infrastructure resource billing system upgrades
• Implementation of tests of new components, their deployment to production, and the updating of customer and/or internal documentation
• Pilot support: the beneficiary wishes to improve the quality of the management of its IT systems and, as such, to benefit from the Provider's expertise in Technical Architecture
• Completion of major upgrades of our components provided by Cloudera
• Ability to develop a close relationship with the Cloudera vendor as well as the team
• Automation of administration and requests (self-service, APIs, scripts, production KPIs)

Profile
• Skill-set must have: Ansible (***), Terraform (***), Hadoop (**), Python (*, basic level)
• 8+ years of experience
• Engineering graduate / MCA / Computer Science graduate or postgraduate

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Consultant Software Engineer - Big Data Azure Administrator with Java