We found 34 jobs matching your search

Advanced Search

Skills

Locations

Experience

Job Description

Cybersecurity Engineering - API Security

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cybersecurity Engineering - API Security

Job Description

Cybersecurity Engineering- Network Security

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cybersecurity Engineering - Network Security

Job Description

Associate - OT Security

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Associate - OT Security

Job Description

Must-Have:
  • Reports (Classical, Interactive, ALV)
  • SmartForms / SAP Scripts / Adobe Forms
  • Enhancements (User Exits, BADI, BAPI, Enhancement Framework)
  • Interfaces (IDoc, RFC, BAPI, Web Services, Proxy, SOAP and REST)
  • Dialog programming
  • Able to coordinate with different TCS internal teams and the customer.

Good to have:
  • Knowledge of ERP-SAP (SD and MM) functionality.
  • Sales process flow
  • Managing integration with other systems as well as SAP modules.

Responsibilities of / expectations from the role:
  1. Analyze new requirements for technical work and provide the best possible solution.
  2. Effort estimation for new developments.
  3. Coordinate with customers & other business stakeholders.
  4. Ensure standards and best practices in SAP ABAP work.

Skills: SAP ERP Materials Management (MM)
Experience Required: 4-6

  • Salary: Rs. 70,000 - Rs. 1,40,000
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: SAP ERP

Job Description

Associate - Cybersecurity Engineering

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Cybersecurity Engineering

Job Description

Skills: Microservices ~ Java API Management & Microservices ~ Core Java

Job Requirements:
  • Build scalable cloud-native Spring Boot microservices and RESTful APIs.
  • Ensure code quality with unit/integration tests (JUnit) and static analysis (Sonar).
  • Good understanding of non-functional requirements.
  • Design data models and access layers for SQL (e.g., SQL Server/PostgreSQL) with performance and reliability in mind.
  • Containerize services with Docker and deploy to Kubernetes (AKS).
  • Implement logging, metrics, and distributed tracing (Datadog/Prometheus).
  • Apply security best practices (OAuth2/OIDC, secrets management, TLS, dependency scanning).
  • Build pipelines in GitLab/Jenkins for build, test, security scans, containerization, and progressive delivery (blue/green or canary).
  • Good knowledge of automated rollbacks, versioning, and release governance.
  • Participate in backlog grooming, estimation, and agile ceremonies.
  • Produce high-quality technical documentation (API specs, architecture diagrams).
  • Troubleshoot production issues and contribute to SRE practices (SLIs/SLOs/error budgets) in partnership with platform teams.

Key Responsibilities:
  • Strong in Core Java and Spring Boot; solid grasp of design patterns, exception frameworks, event-driven architecture, and API gateways.
  • Requirement analysis, design, development, POCs.
  • Ensure the best possible performance, quality, and responsiveness of the applications.
  • Identify bottlenecks and bugs, and devise solutions to these problems.
  • Help maintain code quality, organization, and automation.
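The "progressive delivery (blue/green or canary)" item can be pictured as a weighted traffic split between a stable and a canary release. A minimal sketch follows; it is purely illustrative (the listing's stack is Java/Spring Boot, and in practice the split is usually done by an API gateway or service mesh, not application code):

```python
import random

def route_request(canary_weight: float, rng: random.Random) -> str:
    """Send roughly `canary_weight` of traffic to the canary, rest to stable."""
    return "canary" if rng.random() < canary_weight else "stable"

# Seeded RNG so the demo is repeatable; 10% canary traffic.
rng = random.Random(42)
sample = [route_request(0.1, rng) for _ in range(1000)]
print(sample.count("canary"))  # roughly 100 of 1000 requests
```

If error rates on the canary stay within the SLO's error budget, the weight is raised step by step; otherwise an automated rollback sets it back to zero.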

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microservices ~ Java API Management & Microservices ~ Core Java

Job Description

Java Spring Boot, with exposure to cloud

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java Spring Boot, exposure to cloud

Job Description

We are seeking an experienced Data Modeler with strong expertise in Data Vault 2.0 architecture to support incremental delivery of data solutions for Property & Casualty (P&C) Insurance business processes. The ideal candidate will collaborate closely with Business Analysts, developers, and cross-functional teams to design robust data models and enhance data integration workflows on the Snowflake platform.

Key Responsibilities:
  • Design and develop logical and physical data models using Data Vault 2.0 architecture to support P&C Insurance business processes.
  • Work closely with Business Analysts to understand requirements and document mapping rules for loading data from source systems (Guidewire and legacy platforms) into the target data models.
  • Optimize and refine physical data models for enhanced performance, scalability, and reliability in Snowflake.
  • Collaborate with development and integration teams to ensure data pipelines align with the proposed data models and architectural standards.
  • Conduct model reviews and ensure compliance with industry best practices, governance, and data quality standards.
  • Support incremental delivery and continuous improvement of the enterprise data warehouse.

Experience & Skills Required:
  • Proven experience designing and implementing data models for the P&C Insurance domain.
  • Expert-level knowledge of Data Vault 2.0 modeling techniques and principles.
  • Hands-on experience with data modeling tools such as Erwin.
  • Strong understanding of relational and dimensional database modeling concepts.
  • Excellent analytical, problem-solving, and conceptual design skills.
  • Strong communication and collaboration abilities for working with cross-functional teams.
  • Proficiency in SQL for analyzing source systems and validating data models.
  • Experience working with Snowflake and Guidewire systems is a plus.
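A core Data Vault 2.0 idea behind these mapping rules is the hash key: business keys from different source systems (e.g., Guidewire and legacy) are normalized and hashed so the same entity always lands on the same hub record. A minimal sketch (the delimiter and MD5 choice are common conventions, not prescribed by this listing):

```python
import hashlib

def hub_hash_key(*business_keys: str) -> str:
    """Derive a Data Vault 2.0-style hash key from one or more business keys.

    Each part is trimmed and upper-cased, the parts are joined with a fixed
    delimiter, and the result is hashed, so the same business key maps to
    the same hub record regardless of source-system formatting.
    """
    normalized = "||".join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same policy number from two sources yields one hub key:
print(hub_hash_key("POL-1001 "))
print(hub_hash_key("pol-1001"))
```

In Snowflake the equivalent is typically computed in SQL (e.g., `MD5(...)` over the concatenated, normalized keys) inside the hub/link load logic.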

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Data Modeler

Job Description

Experience and Skills:
  • Experience in data engineering, specifically with Snowflake and DBT.
  • Strong hands-on experience with DBT: building and managing data models using DBT commands, Jinja macros, and configurations, and proficiency in developing and managing DBT projects, testing, and documentation.
  • Strong SQL proficiency, including advanced concepts, is essential.
  • Experience with Snowflake's architecture and optimizing SQL queries for the platform is also necessary.
  • A solid understanding of data warehousing architectures, ETL/ELT processes, data transformation, and data quality is expected.
  • Proficiency with cloud platforms like AWS, Azure, or GCP and experience with version control systems like Git are often required.
  • Familiarity with CI/CD pipelines and workflow management tools is also beneficial.
  • Excellent problem-solving, communication, and collaboration skills are vital, as is the ability to work in an agile environment.
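For context on the "Jinja macros" item: a DBT model is a SQL SELECT statement whose references are resolved by Jinja templating at compile time. As a rough stand-in for that resolution step, the sketch below uses Python's `string.Template` (the model SQL and table name are invented for illustration; real DBT uses Jinja's `{{ ref(...) }}`/`{{ source(...) }}`):

```python
from string import Template  # crude stand-in for DBT's Jinja templating

# A model is just a SELECT; the templated reference is resolved at
# compile time, much like `dbt compile` resolves {{ source(...) }}.
MODEL_SQL = Template("""
select
    policy_id,
    sum(premium) as total_premium
from $source_table
group by policy_id
""")

def compile_model(source_table: str) -> str:
    """Resolve the model's source reference into runnable SQL."""
    return MODEL_SQL.substitute(source_table=source_table).strip()

print(compile_model("raw.policies"))
```

The compiled SQL is then executed against Snowflake, and DBT's built-in tests (uniqueness, not-null, referential) validate the resulting table.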

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Design, Develop, Document, Test and Implement