ABAP Developer
6-9 years
Description:
- Developing ABAP solutions as per the design document requirements (workflow and PI for interfaces)
- Knowledge and understanding of SAP Fiori and S/4HANA is an added advantage
- Participating in complex code reviews and providing feedback
- ABAP Web Dynpro: model generation (RFC, Web Services), internal/external context mapping, eventing, navigation, Adaptive RFC, custom components, interfaces, usage, NetWeaver Development Infrastructure (NWDI tracks, SCs, DCs), and optionally Floorplan Manager (FPM)
- Involvement in project work for various SAP modules
- Cooperation with the portal development team
- ABAP support for SAP implementations in European countries
- Interacting with architects, analysts, and other IT delivery teams to understand the functional requirements
- Developing and implementing changes/enhancements using ABAP for BAU and small change, aligned to SAP standards
- Providing technical input to projects delivering SAP changes, to support the initial project planning and estimation process
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
8 HDC4
Summary: As a Custom Software Engineer, you will engage in the development of custom software solutions designed to meet specific business needs. Your typical day will involve collaborating with cross-functional teams to design, code, and enhance various components across systems or applications. You will use modern frameworks and agile practices to deliver scalable, high-performing solutions, while addressing any challenges that arise during development. The role requires a proactive approach to problem-solving and a commitment to continuous improvement in software development practices.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.
Professional & Technical Skills:
- Must-have skills: proficiency in Python (programming language).
- Good-to-have skills: experience with machine learning.
- Strong understanding of software development methodologies and best practices.
- Experience with modern frameworks and tools for software development.
- Ability to write clean, maintainable, and efficient code.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Salary: Rs. 0.0 - Rs. 2,50,000.0
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Python
Forgerock Identity Management Role Description:
Responsibilities:
- Develop and maintain software applications using Python and SQL.
- Design and implement efficient database solutions.
- Collaborate with cross-functional teams to understand requirements and deliver high-quality solutions.
- Write clean, maintainable, and scalable code.
- Conduct code reviews and provide constructive feedback.
- Analyze complex datasets and generate actionable insights.
- Monitor and optimize application performance.
- Develop technical documentation and provide support.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Python Developer.
- Strong understanding of SQL and relational databases.
- Experience with data modeling, data warehousing, and ETL processes.
- Excellent problem-solving skills and attention to detail.
- Good communication and teamwork abilities.
Skills: Python, SQL, data modeling, ETL, database design, version control (e.g., Git), unit testing, Agile methodologies
Essential Skills: Created by CRWDT for (Python SQL)
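The Python/SQL ETL duties listed above can be sketched as a minimal extract-transform-load pass using Python's built-in sqlite3 module; the table names and schema here are hypothetical, purely for illustration:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw order rows, transform them, and load a summary table.

    A minimal illustration of the extract/transform/load steps named in
    the posting; the raw_orders/customer_totals schema is hypothetical.
    """
    cur = conn.cursor()
    # Extract: read raw rows from a staging table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()

    # Transform: aggregate order amounts per customer.
    totals: dict[str, float] = {}
    for customer, amount in rows:
        totals[customer] = totals.get(customer, 0.0) + amount

    # Load: write the aggregated result into a reporting table.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS customer_totals "
        "(customer TEXT PRIMARY KEY, total REAL)"
    )
    cur.executemany(
        "INSERT OR REPLACE INTO customer_totals VALUES (?, ?)",
        totals.items(),
    )
    conn.commit()
    return len(totals)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)],
    )
    print(run_etl(conn))  # number of customers loaded
```

In a production role the same shape would typically target a warehouse connection rather than sqlite3, with the transform pushed into SQL where practical.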
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Job Description:
AWS Databricks Engineer:
· Working knowledge of the following: Python, Databricks, SQL, Unix
· Extensive hands-on experience with AWS services such as Lambda, Step Functions, CloudTrail, CloudWatch, SNS, SQS, S3, VPC, EC2, RDS, and IAM.
· Experience working with Spark and real-time analytics frameworks.
· Documented experience in a business intelligence or analytics development role on a variety of large-scale projects.
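As a rough illustration of the Lambda-centric stack listed above, here is a minimal sketch of a Lambda-style handler that filters an S3 "ObjectCreated" event and forwards the object keys onward. The event shape follows the standard S3 notification layout, but the queue name is hypothetical and the SQS publish is stubbed out so the sketch runs without AWS credentials (on AWS it would be a boto3 send_message call):

```python
import json

# Hypothetical queue name; in a real deployment this would come from an
# environment variable set in the Lambda configuration.
TARGET_QUEUE = "ingest-events"

def publish(queue: str, message: str) -> None:
    """Stub standing in for an SQS publish (boto3 would be used on AWS)."""
    print(f"-> {queue}: {message}")

def handler(event: dict, context=None) -> dict:
    """Lambda-style entry point for an S3 object-notification event."""
    records = event.get("Records", [])
    # Keep only newly created objects; ignore deletes and other events.
    keys = [
        r["s3"]["object"]["key"]
        for r in records
        if r.get("eventName", "").startswith("ObjectCreated")
    ]
    for key in keys:
        publish(TARGET_QUEUE, json.dumps({"key": key}))
    return {"statusCode": 200, "processed": len(keys)}
```

Step Functions would typically orchestrate several such handlers into a pipeline, with CloudWatch capturing their logs and metrics.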
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
Role Category: Programming & Design
Skill Required: Digital: Amazon Web Services (AWS) Cloud Computing; Digital: Databricks
Define project scope, objectives, and deliverables in collaboration with stakeholders.
Develop detailed project plans, including timelines, milestones, and resource allocation.
Serve as the primary point of contact between the project team and stakeholders.
Manage stakeholder expectations and ensure alignment with project goals.
Provide technical oversight and guidance to ensure project deliverables meet technical standards and requirements in Finacle CBS.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance
PRIMARY PURPOSE: Engineering graduate (Computer/IT engineering) with 3 to 7 years of experience, who is passionate and eager to learn and contribute. The basic requirement is Power BI development, with great command of the tools (Power BI, DAX Studio). Hands-on experience in optimization and performance tuning. Strong understanding of RDBMS. Should have experience with the Snowflake DWH. UI/UX experience preferred.
DUTIES & RESPONSIBILITIES:
1. Build reports based on Snowflake views.
2. Balance load between Power BI and the sources, with very strong SQL knowledge.
3. Create, maintain, and optimize Power BI using DAX Studio and Snowflake.
4. Understand business requirements and deliver solutions as expected.
5. Develop tabular and multidimensional models that are compatible with data warehouse standards.
6. Implement row-level security on data, along with an understanding of application security layer models in Power BI.
7. Develop custom BI products requiring knowledge of scripting and programming languages such as R and Python.
8. Identify key performance indicators (KPIs) with clear objectives and consistently monitor them.
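Duty 8 above (defining KPIs with clear objectives and monitoring them) can be sketched in plain Python; the KPI names, targets, and observed values below are hypothetical examples for a manufacturing/supply-chain dashboard:

```python
from dataclasses import dataclass

@dataclass
class Kpi:
    """A KPI with a clear objective: a name, a target, and a direction."""
    name: str
    target: float
    higher_is_better: bool = True

def evaluate(kpi: Kpi, value: float) -> str:
    """Return 'on track' or 'alert' for a single KPI observation."""
    ok = value >= kpi.target if kpi.higher_is_better else value <= kpi.target
    return "on track" if ok else "alert"

# Hypothetical KPIs with their latest observed values.
observations = [
    (Kpi("on-time delivery rate", 0.95), 0.97),
    (Kpi("scrap rate", 0.02, higher_is_better=False), 0.035),
]
for kpi, observed in observations:
    print(f"{kpi.name}: {evaluate(kpi, observed)}")
```

In Power BI the same logic would normally live in a DAX measure feeding a conditional-formatting rule or an alert; the sketch just makes the target/direction structure of a KPI explicit.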
Knowledge:
• Knowledge of manufacturing and production.
• Knowledge of supply chain processes.
• Knowledge of Microsoft Excel, with the ability to understand and work with formulas and formats submitted by the customer.
Technical Skills:
• Power BI
• DAX
• Fabric
• Snowflake
• SQL
• Excel
• R/Python scripting
Additional Skills:
• Tableau
• Tableau-to-Power BI migration experience
• Data pipelines
• UI/UX
Questionnaire:
1. What is Query Scale-out in Power BI Premium capacity?
a. Query Scale-out creates multiple read replicas of a semantic model to handle concurrent queries while separating refresh workloads from user queries.
2. What is Large Semantic Model Storage Format?
a. It allows datasets larger than 10 GB by storing data in OneLake and loading only frequently accessed data into memory, reducing capacity memory usage.
3. Difference Between Dataflows Gen 1 and Dataflows Gen 2?
a. Dataflows Gen1 are Power BI–centric and store data in ADLS, while Gen2 are Fabric-native, store data in OneLake, and support multiple destinations like Lakehouse and Warehouse.
4. What is the difference between Fabric Lakehouse and Fabric Warehouse?
a. Lakehouse supports both structured and unstructured data using Spark and Delta, while Warehouse is SQL-first, optimized for relational analytics and BI workloads.
5. What is Shortcut in OneLake?
a. A Shortcut is a virtual reference to external storage (ADLS, AWS S3) that allows Fabric to access data without copying it.
6. Explain the difference between XMLA endpoint and Direct Lake access?
a. XMLA endpoint allows model management, metadata access, and DAX queries externally, while Direct Lake queries actual OneLake Delta data directly without loading it into VertiPaq.
Salary: As per industry standard.
Industry: IT-Software / Software Services
Functional Area: IT Software - Application Programming, Maintenance