Job Description:
Primary Skill: Java Spring Boot microservices with GCP, Kafka
- Design, develop, and maintain scalable full-stack applications using Java (Spring Boot) and React.
- Build and integrate event-driven systems using Apache Kafka within a GCP environment.
- Develop RESTful APIs and work with microservices architecture.
- Collaborate with cross-functional teams (DevOps, Product, QA) to deliver high-quality solutions.
- Ensure system responsiveness, performance, and scalability.
- Participate in code reviews, testing, and debugging.
- Leverage GCP services (e.g., Pub/Sub, Cloud Functions, BigQuery) to optimize application performance.
- Write clean, maintainable, and testable code.
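The event-driven work above typically revolves around a well-defined message contract. As a minimal sketch (in Python rather than Java, and with a hypothetical event shape — the field names are assumptions, not part of this role's codebase), the kind of serialize/deserialize contract a Kafka producer and consumer would agree on looks like this:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class OrderEvent:
    # Hypothetical event shape for illustration only.
    order_id: str
    status: str
    amount: float

def serialize(event: OrderEvent) -> bytes:
    """Encode an event as UTF-8 JSON, a common wire format for Kafka message values."""
    return json.dumps(asdict(event)).encode("utf-8")

def deserialize(payload: bytes) -> OrderEvent:
    """Decode a Kafka message value back into the event dataclass."""
    return OrderEvent(**json.loads(payload.decode("utf-8")))

# A real producer would hand serialize(evt) to the Kafka client as the
# message value; here we just round-trip the payload to show the contract.
evt = OrderEvent(order_id="A-100", status="CREATED", amount=49.5)
assert deserialize(serialize(evt)) == evt
```

Keeping the schema in one shared module is what lets producers and consumers evolve independently without breaking each other.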
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
1. Good understanding of data warehousing concepts with hands-on experience across multiple databases, including Oracle, PostgreSQL, and MySQL.
2. Extensive experience in Databricks, with a focus on building scalable data solutions.
3. Proven ability to design, develop, and maintain robust ETL/ELT pipelines using Databricks to extract, transform, and load data from diverse sources into target systems.
4. Strong understanding of data integration from both structured and unstructured sources such as relational databases, flat files, APIs, and cloud storage.
5. Skilled in implementing data validation, cleansing, and reconciliation processes to ensure high data quality and integrity.
6. Hands-on experience with the AWS cloud platform, leveraging its services for data processing and storage.
7. Familiar with Agile and DevOps practices, including Jira, Confluence, GitHub, and CI/CD pipelines.
8. Excellent communication skills and a strong team player with a collaborative mindset.
Tools & Technologies: Databricks, AWS Glue, Redshift, Oracle DB, Python, Jira, Confluence
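The validation and reconciliation work in point 5 can be sketched in plain Python (this is a minimal illustration, not Databricks-specific; the quality rules and the control-total check are illustrative assumptions):

```python
# Minimal row-level validation and source/target reconciliation sketch.
# Rule names and thresholds are illustrative, not a real framework.

def validate_rows(rows):
    """Split incoming rows into clean and rejected sets using simple quality rules."""
    clean, rejected = [], []
    for row in rows:
        # Reject rows with a missing key or a negative amount.
        if row.get("id") is None or row.get("amount", 0) < 0:
            rejected.append(row)
        else:
            clean.append(row)
    return clean, rejected

def reconcile(source_total, target_rows):
    """Compare the source system's control total against the sum actually loaded."""
    target_total = sum(r["amount"] for r in target_rows)
    return {"source": source_total, "target": target_total,
            "matched": abs(source_total - target_total) < 1e-9}

rows = [{"id": 1, "amount": 10.0}, {"id": None, "amount": 5.0}, {"id": 2, "amount": -3.0}]
clean, rejected = validate_rows(rows)
report = reconcile(10.0, clean)
```

In a real pipeline the rejected rows would land in a quarantine table and a mismatched reconciliation would fail the load, rather than silently continuing.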
Salary: Rs. 55,000 - Rs. 95,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Engineer
Work Location: Chennai / Hyderabad / Bangalore / Kolkata (PAN India)
Skill Required: Unix shell scripting and text processing tools, application server deployment & administration, statistics & analytics
Actual Experience Required: 6-8 years (overall 5+ years considered)
SAS Admin: minimum 2 years
Shift: 2-11 PM
Flexibility of WFH: 3-4 hours
Mandatory: SAS Admin certification
Job Description:
Responsibilities:
- Manage and support multiple SAS environments, ensuring high availability and performance.
- Conduct regular system maintenance and updates, including patches and upgrades.
- Monitor system performance and troubleshoot issues as they arise.
- Implement security measures to protect data and ensure compliance with organizational policies.
- Collaborate with analytics and IT teams to optimize SAS application performance.
- Develop and maintain system documentation and standard operating procedures related to SAS administration.
- Provide technical support for end users, including issue resolution and training.
- Ensure data integrity and confidentiality within the SAS environment.
Essential Skills: SAS administration, SAS platform architecture, performance tuning, system optimization, Unix/Linux scripting, Windows Server management, data security, troubleshooting
Comments for Suppliers: Mandatory - SAS Admin certification
Salary: Rs. 70,000 - Rs. 1,10,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Mumbai, MH / Hyderabad, TG / Chennai, TN / Bhubaneswar, OR / Bangalore, KA
Duration: 6 months (Extendable)
Skill Required: Digital: Microsoft Azure, Digital: Python for Data Science, Digital: Databricks, Digital: PySpark, Azure Data Factory
Experience Range in Required Skills: 6-8 Years
Job Description:
1. Designing and implementing data ingestion pipelines from multiple sources using Azure Databricks.
2. Developing scalable and reusable frameworks for ingesting data sets.
3. Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring data quality and consistency are maintained at all times.
4. Working with event-based streaming technologies to ingest and process data.
5. Working with other members of the project team to support delivery of additional project components (API interfaces, Search).
6. Evaluating the performance and applicability of multiple tools against customer requirements.
Key Responsibilities:
1. Develop and maintain ETL/ELT pipelines using Azure Data Factory (ADF) and Azure Databricks.
2. Implement data ingestion flows from diverse sources including Azure Blob Storage, Azure Data Lake, on-prem SQL, and SFTP.
3. Design and optimize data models and transformations using Oracle, Spark SQL, PySpark, SQL Server, and Progress DB SQL.
4. Build orchestration workflows in ADF using activities like Lookup, ForEach, Execute Pipeline, and Set Variable.
5. Perform root cause analysis and resolve production issues in pipelines and notebooks; collaborate on CI/CD pipeline creation using Azure DevOps and Jenkins.
6. Apply performance tuning techniques to Azure Synapse Analytics and SQL DW.
7. Maintain documentation including runbooks, technical design specs, and QA test cases.
8. Data pipeline engineering: design and implement scalable, fault-tolerant data pipelines using Azure Synapse and Databricks.
9. Ingest data from diverse sources including flat files, DB2, NoSQL, and cloud-native formats (CSV, JSON).
Technical Skills Required:
- Cloud Platforms: Azure (ADF, ADLS, ADB, Azure SQL, Synapse, Cosmos DB)
- ETL Tools: Azure Data Factory, Azure Databricks
- Programming: SQL, PySpark, Spark SQL
- DevOps/Automation: Azure DevOps, Git, CI/CD, Jenkins
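The "diverse sources" ingestion described above usually starts by normalizing flat-file formats such as CSV and JSON into a uniform record shape before loading. A minimal stdlib-only sketch (the column names and sample data are illustrative assumptions, not from any real source system):

```python
import csv
import io
import json

def read_csv_records(text):
    """Parse CSV text into a list of dict records, one per row."""
    return list(csv.DictReader(io.StringIO(text)))

def read_json_records(text):
    """Parse a JSON array (or a single object) into a list of dict records."""
    data = json.loads(text)
    return data if isinstance(data, list) else [data]

# Two sources in different formats normalized into one record list.
csv_src = "id,city\n1,Mumbai\n2,Chennai\n"
json_src = '[{"id": "3", "city": "Hyderabad"}]'
records = read_csv_records(csv_src) + read_json_records(json_src)
```

In Databricks the same idea is expressed with `spark.read.csv(...)` and `spark.read.json(...)` producing DataFrames with a shared schema; the normalization step is the part that stays the same.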
Salary: Rs. 70,000 - Rs. 1,30,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Work Location: Chennai, TN
Job Title: Network Administrator
Experience: 4-6 Years
Skill Required: Windows PowerShell, RedHat Linux
Job Description:
1. Install, configure, and maintain Nagios XI monitoring servers and agents.
2. Develop and manage custom plugins, scripts, and templates.
3. Set up host and service checks, notifications, and escalation policies.
4. Integrate Nagios with other tools (e.g., ticketing systems, dashboards).
5. Troubleshoot and resolve issues related to polling, data collection, and performance.
6. Collaborate with infrastructure, network, and business stakeholders to define monitoring requirements.
Essential Skills
Must-Have (ideally no more than 3-5):
- Proven experience administering Nagios XI in an enterprise environment.
- Strong understanding of network protocols, server infrastructure, and application monitoring.
- Proficiency in scripting languages (e.g., Bash, PowerShell, Python) for automation and plugin development.
- Familiarity with SNMP, WMI, NetFlow, and other monitoring protocols.
- Experience with Linux and Windows server environments.
- Excellent problem-solving and communication skills.
Good-to-Have (ideally no more than 3-5):
- Certifications in Nagios, SolarWinds, or related technologies.
- Experience integrating monitoring tools with ITSM platforms (e.g., ServiceNow).
- Understanding of ITIL practices and incident management workflows.
- Excellent verbal and written communication skills to articulate technical concepts clearly.
- Knowledge of cloud environments (AWS, OCI).
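Custom plugin development for Nagios follows a simple convention: print a one-line status message and exit with a code indicating state (0 = OK, 1 = WARNING, 2 = CRITICAL, 3 = UNKNOWN). A minimal disk-space check sketch — the thresholds and mount point are illustrative assumptions:

```python
import shutil
import sys

def check_disk(path="/", warn_pct=20.0, crit_pct=10.0):
    """Return (exit_code, status_line) based on the free-space percentage at path."""
    usage = shutil.disk_usage(path)
    free_pct = usage.free / usage.total * 100
    if free_pct < crit_pct:
        return 2, f"CRITICAL - {free_pct:.1f}% free on {path}"
    if free_pct < warn_pct:
        return 1, f"WARNING - {free_pct:.1f}% free on {path}"
    return 0, f"OK - {free_pct:.1f}% free on {path}"

if __name__ == "__main__":
    code, message = check_disk()
    print(message)   # Nagios reads the first line of stdout as the status text
    sys.exit(code)   # and the process exit code determines the service state
```

Keeping the check logic in a function separate from the `sys.exit` call makes the plugin unit-testable outside of Nagios.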
Salary: Rs. 55,000 - Rs. 95,000
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
1. Experience in SSIS development and production support, maintaining and troubleshooting multiple complex packages.
2. Experience with Azure services such as ADF, ADLS, JAMS, Databricks, and Snowflake is an added advantage.
3. Experience with multiple source and destination tasks.
4. Experience in migration from on-prem to Azure environments.
5. Experience with web service sources/destinations in packages.
6. Experience in SSMS and exposure to other newer Azure tech stacks.
7. Strong SQL skills on MS SQL Server are mandatory.
8. Proficient in analyzing production issues at the package/scheduling level and able to suggest and work on code fixes.
9. Should be able to work in a support model (shifts, on-call, etc.).
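Production support at the package/scheduling level often begins by scanning execution history for failed runs before digging into individual packages. A hedged sketch of that triage step — the "package | status" log format below is an illustrative assumption, not the actual SSISDB catalog schema:

```python
# Scan SSIS-style execution log lines for packages whose runs failed.
# In practice this data would come from the SSISDB catalog or JAMS job
# history; a simple line format stands in for it here.

def failed_packages(log_lines):
    """Return the package names whose runs ended in 'Failure'."""
    failures = []
    for line in log_lines:
        package, _, status = line.partition("|")
        if status.strip() == "Failure":
            failures.append(package.strip())
    return failures

log = [
    "LoadCustomers.dtsx | Success",
    "LoadOrders.dtsx | Failure",
    "LoadInvoices.dtsx | Failure",
]
```

The output of such a scan is what drives the shift handover: each failed package gets a root-cause note and either a rerun or a code fix.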
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance