As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to enhance the overall data architecture and platform functionality.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in continuous learning to stay updated with industry trends and technologies.
- Assist in the documentation of data platform processes and best practices.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Data Building Tool.
- Good-to-Have Skills: Experience with data integration tools and platforms.
- Strong understanding of data modeling concepts and practices.
- Familiarity with cloud-based data solutions and architectures.
- Experience in optimizing data workflows and processes (see the sketch after this posting).
Additional Information:
- The candidate should have a minimum of 3 years of experience with Data Building Tool.
- This position is based at our Bengaluru office.
- A 15-year full-time education is required.
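Assuming "Data Building Tool" refers to dbt (data build tool), the following minimal Python sketch illustrates the kind of workflow automation the role touches on: invoking the dbt CLI to build and then test a single model. The project directory and model name are hypothetical, and dbt Core is assumed to be installed and configured with a valid profile.

# run_dbt.py - minimal sketch: build and test one dbt model from Python (illustrative only)
import subprocess
import sys

PROJECT_DIR = "./analytics_project"  # hypothetical dbt project directory
MODEL = "stg_orders"                 # hypothetical model name

def run_dbt(args):
    # Invoke the dbt CLI inside the project directory and stop on failure
    result = subprocess.run(["dbt", *args], cwd=PROJECT_DIR)
    if result.returncode != 0:
        sys.exit(result.returncode)

if __name__ == "__main__":
    run_dbt(["run", "--select", MODEL])   # materialize the selected model
    run_dbt(["test", "--select", MODEL])  # run the schema and data tests defined for it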
Salary: Rs. 0 - Rs. 1,45,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
4 to 6 years of work experience
At least 2 years of experience with Node.js, Express.js, and building microservices
Ability to probe limitations and drive continuous learning
Strong problem-solving skills, with a focus on clean code and test-driven development
Obsessed with delivering zero-defect code
Roles and Responsibilities
Apply design principles and design patterns to solve common problems
Write code extensively in JavaScript for Node.js
Write unit tests in Mocha to cover the code you write
Follow standard work practices and be Agile
Own one or more components from design and development through end-to-end support
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Key Responsibilities:
Design, implement, and manage CI/CD pipelines using Azure DevOps and GitHub Actions
Manage and maintain Azure Kubernetes Service (AKS) and Amazon EKS
Configure and integrate Azure Storage and AWS S3 with Kubernetes clusters
Implement Infrastructure as Code using Terraform across Azure and AWS
Monitor cloud infrastructure using DataDog, CloudWatch, and Kubernetes-native tools
Set up and configure logging and monitoring using Prometheus, Grafana, and Fluentd
Configure network and security policies for AKS, EKS, and other Kubernetes clusters
Automate infrastructure provisioning and deployments using scripting languages (Shell, Bash, PowerShell)
Debug and troubleshoot issues related to clusters, nodes, and pods (see the Python sketch after this posting)
Work with Postgres, DB2, and other database technologies
Use Docker, Git, and Helm Charts for containerization and deployment
Collaborate with cross-functional teams to ensure DevOps best practices are followed
Required Skills:
Proven experience in Azure DevOps, AKS, CI/CD pipelines, Terraform, and Kubernetes
Hands-on experience with AWS services such as EC2, S3, IAM, CloudFormation, CloudWatch, and EKS
Strong knowledge of monitoring and logging tools: DataDog, CloudWatch, Prometheus, Grafana
Experience with Docker, Git, Helm, and scripting languages
Familiarity with networking and security policies in cloud-native environments
Strong problem-solving, analytical, and communication skills
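As a hedged illustration of the pod-level troubleshooting and scripted automation described above, the following minimal Python sketch shells out to kubectl and reports pods that are not in the Running phase. The posting names Shell, Bash, and PowerShell as scripting languages; Python is used here only to keep the sketch compact, the namespace is a hypothetical placeholder, and kubectl is assumed to be installed and already authenticated against the target AKS or EKS cluster.

# pod_health.py - minimal sketch: flag pods that are not Running (illustrative only)
import json
import subprocess

NAMESPACE = "payments"  # hypothetical namespace

def unhealthy_pods(namespace):
    # Ask kubectl for all pods in the namespace as JSON and parse the result
    out = subprocess.run(
        ["kubectl", "get", "pods", "-n", namespace, "-o", "json"],
        capture_output=True, text=True, check=True,
    ).stdout
    pods = json.loads(out)["items"]
    # Keep only pods whose reported phase is not Running
    return [
        (p["metadata"]["name"], p["status"].get("phase", "Unknown"))
        for p in pods
        if p["status"].get("phase") != "Running"
    ]

if __name__ == "__main__":
    for name, phase in unhealthy_pods(NAMESPACE):
        print(f"{name}: {phase}")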
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Development & Integration, Level 9
Summary:
We are seeking a skilled Cloudera SME with deep expertise in Cloudera Data Platform (CDP) and hands-on experience in development and integration. The candidate will be responsible for architecting, implementing, and optimizing data solutions using Cloudera, ensuring seamless integration with enterprise systems and cloud platforms.
Roles & Responsibilities:
Key Responsibilities:
- Architect and implement scalable data solutions using Cloudera Data Platform.
- Develop and optimize ETL/ELT pipelines for structured and unstructured data (see the sketch after this posting).
- Integrate Cloudera with enterprise applications and cloud services.
- Configure and manage security, governance, and compliance frameworks.
- Troubleshoot and resolve issues related to the Cloudera platform and integrations.
- Design and maintain dashboards for monitoring data flows, performance, and health.
- Collaborate with cross-functional teams to deliver business-critical data solutions.
Professional Attributes:
- Excellent verbal and written communication skills.
- Ability to work independently and collaboratively in a fast-paced environment.
- Strong analytical and problem-solving skills.
Professional & Technical Skills:
Must-Have Skills:
- Cloudera product expertise (CDP, Hadoop, Spark, Hive, Impala, NiFi, Kafka)
- Strong experience in Python/Java/Scala for data engineering and automation
- Integration of Cloudera with cloud platforms (AWS, Azure, GCP)
- Data pipeline design and orchestration (NiFi, Airflow)
- Security and governance (Ranger, Atlas)
Good-to-Have Skills:
- Experience with containerization (Docker, Kubernetes)
- Familiarity with ServiceNow ITOM or similar ITSM tools
- DataOps and CI/CD pipeline implementation
- Exposure to data observability and monitoring tools
Additional Information:
- Minimum of 3.5 years of experience in Cloudera.
- Proficiency in Python, Java, or Scala for data engineering tasks.
- Experience with NiFi, Kafka, and cloud integration.
- Knowledge of data security, governance, and monitoring best practices.
Educational Qualification and Certification:
- Graduate degree in Computer Science, Information Technology, or a related field.
- Relevant Cloudera certifications (e.g., CCA, CCP) are a plus.
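Because the role centers on building ETL/ELT pipelines on Cloudera, here is a minimal PySpark sketch of a typical batch step: read a raw CSV landing file, apply basic cleansing, and persist the result as a Hive-managed table that Hive or Impala can query. The input path, table name, and column names are hypothetical; a Spark environment with Hive support (as on CDP) is assumed.

# orders_etl.py - minimal PySpark sketch of one batch ETL step (illustrative only)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders_etl")
    .enableHiveSupport()  # allow writing Hive-managed tables on the cluster
    .getOrCreate()
)

# Hypothetical raw file landed on HDFS or object storage
raw = spark.read.option("header", "true").csv("/data/raw/orders.csv")

# Basic cleansing: drop rows without an order id and cast the amount column to double
clean = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("amount", F.col("amount").cast("double"))
)

# Overwrite the curated table used by downstream Hive/Impala queries
clean.write.mode("overwrite").saveAsTable("analytics.orders_clean")

spark.stop()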
Salary: Rs. 0 - Rs. 1,65,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Bangalore
Skill Required: Java Build and Deployment
Experience Range in Required Skills: 4 to 6 Years
Job Description:
Java developer
Essential Skills:
Java developer
Salary: Rs. 55,000 - Rs. 95,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance