As an operational risk officer, you will support the SOC team in their daily activities and administer Operational Security Processes. You will be asked to identify improvements in current processes and formalize them through clear documentation.
Beyond the ongoing administration of processes, your main responsibility will be to manage the vulnerability scan process, which is based on Qualys tools.
• Responsible for understanding, reviewing, and interpreting assessment and scanning results, reducing false positive findings, and acting as a trusted security advisor to the client.
• Identify and prioritize all vulnerabilities in client environments and provide timely vulnerability assessment reports to key stakeholders
• Develop and report enterprise-level metrics for vulnerabilities and remediation progress
• User request administration: manage user requests on the platforms, including adding hosts and asset groups, creating scans, reports, or dashboards (using the standards and processes delivered by SOC SG), and performing emergency stops of scans.
• Manage vulnerability scans for GTS: manage the change management process to request a scan on GTS infrastructure, from change creation through completion of the change process, followed by job creation on the Qualys platform.
• Present vulnerability assessment scanning and guidance, false positive validation, compliance scanning, and scan profile and policy creation.
• Vulnerability analysis: based on group standards, manage alerting on critical vulnerabilities found by a vulnerability scan and follow up on mitigation with remediation teams.
• Ability to identify false positives
• Knowledge of vulnerability management frameworks and concepts such as CVE, CVSS scoring, and attack vectors.
• Dashboard: generate monthly and quarterly reports and dashboards.
• Qualys tags: understanding and use of Qualys asset tags.
• Manage Internal Qualys infrastructure: survey the status of Qualys appliances and manage the RMA process and deployment of new appliances.
• Implement automated, proactive security measures
• Hands-on experience with Qualys modules: Vulnerability Management, Policy Compliance, Web Application Scanning, Cloud Agent, AssetView, Container Security, and VMDR.
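The CVSS scoring concept listed in the qualifications above can be made concrete with a small sketch: mapping a CVSS v3.1 base score to its qualitative severity rating, using the bands defined in the CVSS v3.1 specification. The function name is illustrative only and is not part of any Qualys API.

```python
def cvss_severity(score: float) -> str:
    """Return the CVSS v3.1 qualitative severity for a base score (0.0-10.0).

    Bands per the CVSS v3.1 specification:
    0.0 None, 0.1-3.9 Low, 4.0-6.9 Medium, 7.0-8.9 High, 9.0-10.0 Critical.
    """
    if not 0.0 <= score <= 10.0:
        raise ValueError(f"CVSS base score out of range: {score}")
    if score == 0.0:
        return "None"
    if score <= 3.9:
        return "Low"
    if score <= 6.9:
        return "Medium"
    if score <= 8.9:
        return "High"
    return "Critical"

# Example: triage a scan finding by severity band.
print(cvss_severity(9.8))  # prints "Critical"
```

A mapping like this is what drives the alerting-on-critical-vulnerabilities step described earlier: findings at or above the Critical band trigger the escalation workflow.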
Salary: Rs. 0.0 - Rs. 12,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Skill Required: Mainframe DB2 - Application Development
Experience Range: 8 – 10 Years
Job Description:
We are seeking a highly skilled and experienced Mainframe DB2 – Application Developer to join our team at OP Financial Group. The ideal candidate should have strong expertise in Mainframe application development, DB2, and COBOL with a proven track record of delivering robust, scalable, and efficient solutions in the financial services domain.
Essential Skills:
• Design, develop, and maintain mainframe applications using COBOL, DB2, JCL, CICS, and VSAM.
• Work on DB2 database design, queries, stored procedures, and performance tuning.
• Collaborate with business analysts and stakeholders to gather requirements and translate them into technical solutions.
• Perform code reviews, unit testing, and integration testing to ensure high-quality deliverables.
• Troubleshoot and resolve production issues in a timely manner.
• Ensure compliance with coding standards, security policies, and industry best practices.
• Work in an Agile/DevOps environment, contributing to sprint planning, estimations, and continuous improvement.
• Provide technical guidance and mentorship to junior developers when required.
Salary: Rs. 65,000.0 - Rs. 1,25,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
1. 4+ years of strong experience in Salesforce integration
2. Good experience in Salesforce Community Cloud
3. Good experience in REST API and SOAP API integration
4. Strong analytical and communication skills
5. Good experience in Salesforce Lightning is an added advantage
6. Good team skills
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Role: Salesforce Community Cloud and integration.
As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and procedures.
- Collaborate with cross-functional teams to gather requirements and provide technical insights.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in IBM AS/400 RPG III.
- Strong understanding of application development methodologies.
- Experience with debugging and troubleshooting application code.
- Familiarity with database management systems and data manipulation.
- Ability to work with version control systems for code management.
Additional Information:
- The candidate should have a minimum of 3 years of experience in IBM AS/400 RPG III.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 1,80,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Experience: 4-6 Years
Work Location: Chennai, TN || Bangalore, KA || Hyderabad, TS
Skill Required: Digital: Big Data and Hadoop Ecosystems; Digital: PySpark
Job Description:
• Work as a developer in Big Data, Hadoop, or data warehousing tools and cloud computing.
• Work on Hadoop, Hive SQL, Spark, and Big Data ecosystem tools.
• Experience working with teams in a complex organization involving multiple reporting lines.
• Strong functional and technical knowledge to deliver what is required, and well acquainted with banking terminology.
• Strong DevOps and Agile development framework knowledge.
• Create Scala/Spark jobs for data transformation and aggregation.
• Experience with stream-processing systems such as Storm, Spark Streaming, and Flink.
Essential Skills:
• Working experience with Hadoop, Hive SQL, Spark, and Big Data ecosystem tools.
• Able to tune queries and work on performance enhancement.
• Responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing.
• Strong functional and technical knowledge to deliver what is required, and well acquainted with banking terminology. Occasionally, the candidate may be responsible as a primary contact and/or driver for small to medium-sized projects.
• Strong DevOps and Agile development framework knowledge.
• Good technical knowledge of cloud computing (AWS or Azure cloud services) is preferable.
• Strong conceptual and creative problem-solving skills, ability to work with considerable ambiguity, and ability to learn new and complex concepts quickly.
• Experience working with teams in a complex organization involving multiple reporting lines.
• Solid understanding of object-oriented programming and HDFS concepts.
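The stream-processing systems named above (Storm, Spark Streaming, Flink) are built around windowed aggregations over unbounded event streams. The core idea can be sketched in plain Python as a toy tumbling-window count; this is not tied to any of those frameworks, and all names here are illustrative.

```python
from collections import Counter, defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed-size tumbling windows
    and count occurrences of each key per window, similar in spirit to
    a Spark Streaming or Flink windowed aggregation."""
    windows = defaultdict(Counter)
    for ts, key in events:
        # Each event belongs to exactly one window: [start, start + window_size)
        window_start = (ts // window_size) * window_size
        windows[window_start][key] += 1
    return dict(windows)

events = [(0, "login"), (3, "login"), (7, "error"), (12, "login")]
print(tumbling_window_counts(events, 10))
# Window 0-9 counts two "login" and one "error"; window 10-19 counts one "login".
```

In the real frameworks the same aggregation runs continuously over a stream, with watermarks handling late events, but the per-window grouping logic is the same.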
Salary: Rs. 70,000.0 - Rs. 1,30,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding the team in implementing effective solutions. You will also engage with stakeholders to gather requirements and provide updates on project progress, ensuring alignment with business objectives and fostering a collaborative environment for innovation and efficiency.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Facilitate knowledge sharing sessions to enhance team capabilities.
- Monitor project progress and ensure adherence to timelines and quality standards.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in SAP ABAP Development for HANA.
- Strong understanding of application design principles and methodologies.
- Experience with performance tuning and optimization of ABAP programs.
- Familiarity with SAP HANA database concepts and data modeling.
- Ability to troubleshoot and resolve technical issues efficiently.
Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP ABAP Development for HANA.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 2,00,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Title: Developer
Work Location: Chennai TN and Bangalore KA
Skill Required: Digital: Python; Digital: Machine Learning; Generative AI
Experience Range in Required Skills: 4-6 years
Job Description: Python, Generative AI, Machine Learning
Essential Skills: Python, Generative AI, Machine Learning
Salary: Rs. 70,000.0 - Rs. 1,30,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Experienced Senior Data Engineer with a strong background in PySpark, Python, and real-time data processing. The ideal candidate will have hands-on experience in building and maintaining scalable data applications, working with large-scale datasets, and implementing machine learning algorithms to match, merge, and enrich data. Expertise in AWS DynamoDB, Elasticsearch, and match algorithms is essential.
Key Responsibilities:
• Design, develop, and maintain robust data pipelines using PySpark (Spark Streaming) and Python.
• Handle large-scale structured and semi-structured data ingestion, transformation, and enrichment from diverse sources.
• Implement and optimize match, merge, and enrich algorithms for high-volume data processing.
• Apply machine learning libraries and custom-built algorithms for real-time analytics and decision-making.
• Architect and manage end-to-end data systems, including data modeling, storage, and visualization.
• Work with AWS services, particularly DynamoDB and ElasticSearch, to support scalable and efficient data operations.
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Ensure data quality, integrity, and security across all stages of the data lifecycle.
• Excellent problem-solving skills and the ability to guide and mentor the team on technical implementations.
Required Skills & Qualifications:
• Strong proficiency in PySpark, Spark Streaming, and Python, with a minimum of 4 years of experience in PySpark.
• Experience in real-time data processing and large-scale data systems.
• Hands-on experience with AWS EMR (Dynamics connector), DynamoDB, and Elasticsearch.
• Experience implementing match algorithms and working with data enrichment workflows.
• Solid understanding of data modeling, ETL processes, and data visualization tools.
• Familiarity with machine learning frameworks and integrating ML into data pipelines.
• Strong communication and documentation skills.
• Experience with additional AWS services (e.g., S3, Lambda, Glue).
• Experience reading from an Enterprise Service Bus (ESB); MS Dynamics experience is good to have.
• Exposure to CI/CD pipelines and DevOps practices; Concourse experience is good to have.
• Knowledge of data governance and compliance standards.
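The match-merge-enrich workflow described above can be illustrated with a toy sketch: deterministic blocking on a normalized key, a field-level merge where the first non-empty value wins, then enrichment from a lookup table. Real implementations in the stack described here would typically use Elasticsearch queries and probabilistic matching at scale; every name and record below is hypothetical.

```python
def normalize(name: str) -> str:
    """Blocking key: lowercase, strip punctuation and collapse whitespace."""
    kept = "".join(c for c in name.lower() if c.isalnum() or c.isspace())
    return " ".join(kept.split())

def match_merge_enrich(records, enrichment):
    """Group records whose normalized names match, merge their fields
    (first non-empty value wins), then enrich from a lookup keyed on
    the same normalized name."""
    merged = {}
    for rec in records:
        key = normalize(rec["name"])
        target = merged.setdefault(key, {})
        for field, value in rec.items():
            if value and not target.get(field):
                target[field] = value
    for key, rec in merged.items():
        rec.update(enrichment.get(key, {}))
    return merged

records = [
    {"name": "ACME Corp.", "city": "Austin", "phone": ""},
    {"name": "acme corp", "city": "", "phone": "555-0100"},
]
enrichment = {"acme corp": {"industry": "Manufacturing"}}
out = match_merge_enrich(records, enrichment)
print(out["acme corp"])
# {'name': 'ACME Corp.', 'city': 'Austin', 'phone': '555-0100', 'industry': 'Manufacturing'}
```

The two input records match on the normalized key "acme corp", their non-empty fields are merged into one golden record, and the enrichment lookup adds the industry attribute.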
Salary: Rs. 0.0 - Rs. 1,80,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance