We found 730 jobs matching your search.


Job Description

Key Responsibilities
  • Develop and maintain server-side applications using Node.js
  • Design and implement RESTful APIs
  • Work closely with front-end developers and other stakeholders to integrate user-facing elements
  • Write clean, efficient, and reusable code
  • Troubleshoot and debug applications
  • Ensure performance, quality, and responsiveness of applications
  • Participate in code reviews and technical discussions
  • Collaborate with cross-functional teams to define and deliver new features
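The core of this role is designing and implementing RESTful APIs. The posting targets Node.js (where Express or a similar framework would be typical), but as a language-neutral sketch of the resource/route/status-code pattern it describes, here is a minimal example in Python with Flask; the /tasks resource and the in-memory store are illustrative only, not part of the posting.

```python
# A minimal, language-neutral sketch of the REST resource pattern the
# posting describes. The actual role calls for Node.js (e.g., Express);
# the /tasks routes and in-memory store here are illustrative only.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)
tasks = {}      # in-memory stand-in for a real datastore
next_id = 1

@app.route("/tasks", methods=["GET"])
def list_tasks():
    return jsonify(list(tasks.values()))

@app.route("/tasks", methods=["POST"])
def create_task():
    global next_id
    body = request.get_json(force=True)
    task = {"id": next_id, "title": body.get("title", "")}
    tasks[next_id] = task
    next_id += 1
    return jsonify(task), 201   # 201 Created for a successful POST

@app.route("/tasks/<int:task_id>", methods=["GET"])
def get_task(task_id):
    if task_id not in tasks:
        abort(404)              # missing resource -> 404, per REST convention
    return jsonify(tasks[task_id])

if __name__ == "__main__":
    app.run(debug=True)
```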

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Node.js Developer

Job Description

Databricks, Azure Data Factory

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Databricks, Azure Data Factory

Job Description

"Requirement for Snowflake, Azure/AWS admin, DevOps resource for DTNA. Sharing the JD below: • Azure Adminitstration- Azure resource deployment/management using IaC bicep or Terraform. Azure Networking. IAM. • AWS Administration- AWS resource deployment/management using IaC Terraform, AWS CLI, Cloud Formation. AWS Networking. IAM. • Azure DevOps- Automation of IaC using Azure Pipelines. Collaborate with development teams to support (CI/CD) processes. • Bicep- For creating IaC code for Azure resources. • Terraform- For creating IaC code for Azure/AWS resources. • Git Actions- Automation of IaC using Git Actions. • Snowflake- Management of snowflake objects like Database, Table, Tasks, Integrations, Pipes etc, access management. • Schemachange- For creating IaC code for deploying/managing Snowflake objects. • Matillion- Administration of Matillion Designer. Management of Matillion agents."

Responsibilities

  • Salary: Rs. 0.0 - Rs. 15.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: DevOps Engineer - Sandip

Job Description

Essential Skills:
  • Deep Microsoft Azure product knowledge and understanding of product features.
  • Proven track record in designing, developing, and implementing data modelling at a conceptual, logical, and physical level.
  • Experience with accountability for end-to-end (E2E) data architecture and lifecycle management, including non-functional requirements and operations.
  • Understands industry-recognised data modelling patterns and standards.
  • Understands overall IT system design, in particular networking, authorization and authentication protocols, data security, and disaster recovery.
  • Knowledge of the latest technologies for processing large amounts of data (Apache Spark; Databricks is a plus).
  • Experience with MDM, metadata management, data quality, and data lineage tools.
  • Expertise in cloud security and identity best practices.
  • A deep understanding of ETL and ELT tools and techniques.
  • Exposure to other cloud solutions (Google, AWS) a plus.
  • Strong experience architecting "fit-for-purpose" and "fit-for-use" complex solutions on the Microsoft Azure cloud stack (IaaS, PaaS, and SaaS).
  • Very strong hands-on experience with Microsoft Azure services is mandatory: Azure Data Factory, Databricks services (e.g., processing streaming data using Spark clusters), Blob containers, ESB, Event Grid, Azure SQL Server, Cosmos DB, Azure Functions, and analytics (e.g., Power BI).
  • Excellent understanding of architecture patterns and best practices, with proven innovative and leading-edge thinking to build cloud-ready systems (e.g., domain-driven architecture, multi-tenancy, building for resilience, scalability, performance, federation, microservices).
  • Hands-on work experience on application platforms like .NET, Python, or Java preferred.
  • Understanding of big data platform concepts and NoSQL technologies, and the ability to map them to solutions.
  • Experience modeling traditional structured data as well as information design in data lake and NoSQL database engines.
  • Fluent with modelling, development, and project execution processes around star, snowflake, 3NF, and Data Vault approaches.
  • Good experience working with data using SQL, Python, and Scala in Microsoft-based resources like Azure Synapse and Data Lake.
  • Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.

Desirable Skills:
  • A great team player, with the ability to integrate into and play a pivotal role in a team that also includes data scientists, data engineers, and data analysts.
  • Able to weigh short-term against long-term goals and take both tactical and strategic decisions.
  • Great communication skills; able to communicate complex technical concepts to a non-technical audience.
  • Strong organizational skills; able to work in a fast-paced environment and quickly adapt to changing priorities.
  • Stays on top of the latest trends and develops expertise in emerging cloud technologies; works well as both a technical leader and an individual contributor.
  • Builds processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Guides the development of an Agile data development experience, when required using DW automation tools.
  • Assists the project team in planning and execution to achieve daily and sprint goals.
  • Produces articulate documentation of project artefacts and deliverables.
  • Identifies and communicates risks clearly to the product owner and the team.
  • Educates the customer on how the recommended technology/solution will fulfil the customer's expressed needs.
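The mandatory hands-on skills center on Azure Data Factory and Databricks. As a rough illustration of the Databricks side, here is a minimal PySpark sketch that reads semi-structured events from Blob Storage and writes a partitioned fact table; the storage path, column names, and table name are assumptions, and the Delta format presupposes a Databricks (or otherwise Delta-enabled) runtime.

```python
# A minimal PySpark sketch of the kind of work the posting describes:
# reading raw events from Azure Blob Storage (the abfss path is a
# placeholder) and writing a cleaned, partitioned Delta table. On a
# Databricks cluster `spark` is provided; locally, build a session first.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-clean").getOrCreate()

# Placeholder container/account names; a real job takes these as config.
raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/events/"

events = (
    spark.read.json(raw_path)                    # semi-structured input
    .withColumn("event_date", F.to_date("ts"))   # assumes a `ts` column
    .dropDuplicates(["event_id"])                # assumes an `event_id` key
)

# Star-schema-style fact table, partitioned by date for query performance.
(events.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("analytics.fact_events"))
```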

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Databricks, Azure Data Factory

Job Description

Required Skills & Qualifications:
  • Proficiency in Microsoft Excel (including formulas, charts, and macros).
  • Familiarity with tools like ChatGPT, Claude, and Grok to quickly generate code for automation scripts.
  • Basic to intermediate coding skills (Python, VBA, or similar).
  • Experience with data entry and data management.
  • Strong organizational and time management skills.
  • Excellent written and verbal communication skills.
  • Ability to work independently and as part of a team.
Preferred Qualifications:
  • Bachelor's degree in Computer Science, Business, Data Analytics, or a related field.
  • Prior experience in a project coordination or data-focused role.
  • Understanding of basic project management principles.
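Since the role pairs Excel proficiency with basic Python scripting, here is a minimal sketch of the kind of spreadsheet automation implied, using the openpyxl library; the file name, sheet layout, and data are illustrative only.

```python
# A minimal sketch of the Excel automation the posting implies, using
# openpyxl. File, sheet, and column names are illustrative placeholders.
from openpyxl import Workbook
from openpyxl.chart import BarChart, Reference

wb = Workbook()
ws = wb.active
ws.title = "Tracker"

ws.append(["Task", "Hours"])          # header row
for task, hours in [("Data entry", 12), ("Cleanup", 5), ("Review", 3)]:
    ws.append([task, hours])

ws["B6"] = "=SUM(B2:B4)"              # an Excel formula, evaluated by Excel

# Simple bar chart over the Hours column.
chart = BarChart()
data = Reference(ws, min_col=2, min_row=1, max_row=4)
cats = Reference(ws, min_col=1, min_row=2, max_row=4)
chart.add_data(data, titles_from_data=True)
chart.set_categories(cats)
ws.add_chart(chart, "D2")

wb.save("tracker.xlsx")
```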

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Project Assistant – Data & Operations

Job Description

Key Responsibilities:
  • Requirement Analysis: collaborate with business stakeholders to understand use cases and transform them into technical solutions based on graph modeling.
  • Data Modeling: design and implement graph data models using Neo4j, ensuring they meet both current and future requirements.
  • Query Development: write optimized Cypher queries and procedures for complex graph traversals and data extraction.
  • Integration & ETL: integrate Neo4j with existing enterprise systems (e.g., Snowflake, Azure Data Factory, REST APIs) using ETL pipelines.
  • Performance Tuning: analyze query performance and implement strategies to optimize performance and scalability.
  • Deployment Support: support deployment of Neo4j clusters in cloud or hybrid environments and ensure data security and backups.
  • Documentation & Training: document technical designs, data models, and workflows; provide knowledge transfer to internal teams.
Required Skills:
  • 7+ years of experience in data engineering or database development.
  • 4+ years of hands-on experience with Neo4j and Cypher.
  • Strong understanding of graph theory and data modeling.
  • Experience integrating Neo4j with data pipelines (Kafka, Spark, ADF, APIs).
  • Familiarity with Neo4j Bloom, APOC procedures, and the Graph Data Science (GDS) library is a plus.
  • Experience with cloud platforms (Azure preferred).
  • Strong communication and stakeholder management skills.
  • Comfortable working in the CET (Central European) time zone.
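As a small illustration of the query-development responsibility, here is a hedged sketch using the official Neo4j Python driver (5.x) with a parameterized Cypher traversal; the connection URI, credentials, and the Person/KNOWS schema are assumptions, not part of the posting.

```python
# A minimal sketch of Cypher query development with the official Neo4j
# Python driver (5.x API). URI, credentials, and the Person/KNOWS graph
# schema are illustrative placeholders.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("neo4j://localhost:7687",
                              auth=("neo4j", "password"))

def shortest_intro_path(tx, a, b):
    # Parameterized Cypher keeps the query plan cacheable and injection-safe.
    result = tx.run(
        """
        MATCH p = shortestPath(
            (x:Person {name: $a})-[:KNOWS*..6]-(y:Person {name: $b}))
        RETURN [n IN nodes(p) | n.name] AS names
        """,
        a=a, b=b,
    )
    record = result.single()
    return record["names"] if record else None

with driver.session() as session:
    print(session.execute_read(shortest_intro_path, "Alice", "Bob"))

driver.close()
```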

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Neo4j Consultant

Job Description

Java/J2EE, Spring Boot, Spring Framework, REST APIs, Angular 13.16; AWS knowledge/experience is good to have.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Java/J2EE

Job Description

About the Role:
  • Interpret creative briefs to develop the right creatives for marketing campaigns for Shopsy, along with landing pages.
  • Edit videos and develop creative motion graphics for multiple platforms.
  • Design creatives for the Shopsy home page, category page, CRM, social, etc.
  • Revise designs based on learnings from previous campaigns and communications.
  • Collaborate with copywriters and other internal and external stakeholders to create the right customer experience and deliver business results.
  • Ensure the final output is aesthetically pleasing, engaging, and explains the proposition to the end customer in the best possible way.
  • Drive creative innovation and trend storytelling via constant research on global graphic design, colour, typography, and trends.

You are responsible for: innovative and engaging design creations to aid brand marketing, CRM, and performance marketing.

To succeed in this role, you should have a creative, collaborative, and innovative mindset and a go-getter attitude.

Mandatory Skills:
  • Graphic/brand designing.
  • Hands-on experience curating visuals and animations and editing videos (chroma removal, masking, etc.).
  • Proficiency in Adobe software (Photoshop, Illustrator, After Effects), in design principles (typography, composition, scale, and color theory, among others), and working knowledge of how those principles work together.
  • An online portfolio demonstrating strong visual and motion design skills and advertising experience across a range of media (online, mobile, or print) is required.
  • Attention to detail.

Desirable Skills: 1. Innovation 2. Design Thinking 3. Presentation Skills

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Graphics + Motion Designer
