We found 1165 jobs matching your search

Job Description

Strong understanding of the display advertising ecosystem: ad networks, rich media vendors, and RTB/programmatic media buying. Governs processes to drive quality assurance in campaign deliveries. Excellent Microsoft Excel skills, including (but not limited to) VLOOKUP, SUMIF, IFERROR, and pivot tables. 2+ years of Ad Ops experience.
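
The Excel requirements above (VLOOKUP/SUMIF/IFERROR, pivot tables) correspond to routine campaign-QA aggregations. Purely as an illustration, the sketch below reproduces a SUMIF-style conditional total and a pivot table over hypothetical campaign-delivery data using Python and pandas; the column names and figures are invented for the example.

```python
import pandas as pd

# Hypothetical campaign-delivery data; columns and values are invented for illustration.
deliveries = pd.DataFrame({
    "campaign": ["Brand-A", "Brand-A", "Brand-B", "Brand-B"],
    "ad_network": ["NetworkX", "NetworkY", "NetworkX", "NetworkY"],
    "impressions": [120_000, 95_000, 80_000, 60_000],
    "clicks": [840, 610, 420, 300],
})

# SUMIF-style conditional total: impressions delivered through NetworkX only.
networkx_impressions = deliveries.loc[deliveries["ad_network"] == "NetworkX", "impressions"].sum()

# Pivot table: impressions and clicks per campaign per network, as an Excel pivot would show.
pivot = deliveries.pivot_table(index="campaign", columns="ad_network",
                               values=["impressions", "clicks"], aggfunc="sum")

print(networkx_impressions)
print(pivot)
```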

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: DSP, Google Ads, Google Ads Editor, Social Marketing, Excel Pivot Formulas

Job Description

Job Description: Hands-on automation experience using Java Selenium and experience in testing web-based systems. Candidates must have hands-on experience automating dynamic web pages/components that support interactive, responsive user experiences, strong troubleshooting skills, and a detail-focused approach with an emphasis on quality and end-to-end testing. Outstanding knowledge of the principles of designing easily modifiable and extensible automated tests. Ability to write complex queries to perform data validations or set up test data. CI/CD using Jenkins; Selenium, Appium, Cucumber; proficiency in JavaScript. Essential Skills: Test Automation, Selenium. Desirable Skills: the same hands-on Selenium automation experience described above, plus proficiency in JavaScript, SQL, and testing of REST services.
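
The posting asks for Java Selenium; solely to illustrate the WebDriver concepts named above (locating dynamically rendered components, waiting on responsive pages, asserting an end-to-end outcome), here is a minimal sketch using the Selenium Python bindings. The URL and locators are placeholders, not part of the posting.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# Placeholder URL and locators; real tests would target the application under test.
driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")

    # Explicit waits handle dynamically rendered components instead of fixed sleeps.
    wait = WebDriverWait(driver, timeout=10)
    username = wait.until(EC.visibility_of_element_located((By.ID, "username")))
    username.send_keys("test-user")

    driver.find_element(By.ID, "password").send_keys("test-pass")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Assert the post-login state as an end-to-end check.
    wait.until(EC.url_contains("/dashboard"))
finally:
    driver.quit()
```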

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Test Automation, Selenium

Job Description

Job Description: Power Automate / Power Platform / PAD. Looking for candidates with strong hands-on experience in Power Automate, Power Platform, and Power Automate Desktop (PAD). Should be capable of designing and implementing end-to-end automation solutions.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Microsoft Power Platform

Job Description

Job Description: AS400 developer - RPG, RPG CL. AS400 RPG Developer responsible for designing, developing, and maintaining applications on the IBM iSeries (AS/400) platform using RPG programming languages, including free-format RPG, as well as CL.

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: AS400_CLP

Job Description

Key Responsibilities
  • Develop and maintain server-side applications using Node.js
  • Design and implement RESTful APIs (see the sketch after this list)
  • Work closely with front-end developers and other stakeholders to integrate user-facing elements
  • Write clean, efficient, and reusable code
  • Troubleshoot and debug applications
  • Ensure performance, quality, and responsiveness of applications
  • Participate in code reviews and technical discussions
  • Collaborate with cross-functional teams to define and deliver new features
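
The posting is for Node.js; to keep the examples in this document in a single language, the RESTful-API pattern referred to above is sketched below in Python (Flask). It is a generic illustration of a resource endpoint, not this team's actual API; the route, fields, and in-memory store are invented.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Invented in-memory store standing in for a real database.
users = {1: {"id": 1, "name": "Asha"}}

@app.get("/api/users/<int:user_id>")
def get_user(user_id):
    # Return a single resource, or 404 if it does not exist.
    user = users.get(user_id)
    if user is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(user)

@app.post("/api/users")
def create_user():
    # Create a resource from the JSON payload and return 201 with the new entity.
    payload = request.get_json(force=True)
    new_id = max(users) + 1
    users[new_id] = {"id": new_id, "name": payload.get("name", "")}
    return jsonify(users[new_id]), 201

if __name__ == "__main__":
    app.run(debug=True)
```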

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Node.js developer

Job Description

Databricks, Azure Data Factory

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Databricks, Azure Data Factory

Job Description

Requirement for Snowflake, Azure/AWS admin, DevOps resource for DTNA. Sharing the JD below:
  • Azure Administration - Azure resource deployment/management using IaC (Bicep or Terraform). Azure networking. IAM.
  • AWS Administration - AWS resource deployment/management using IaC (Terraform, AWS CLI, CloudFormation). AWS networking. IAM.
  • Azure DevOps - Automation of IaC using Azure Pipelines. Collaborate with development teams to support CI/CD processes.
  • Bicep - For creating IaC code for Azure resources.
  • Terraform - For creating IaC code for Azure/AWS resources.
  • GitHub Actions - Automation of IaC using GitHub Actions.
  • Snowflake - Management of Snowflake objects such as databases, tables, tasks, integrations, and pipes, plus access management (see the sketch after this list).
  • Schemachange - For creating IaC code for deploying/managing Snowflake objects.
  • Matillion - Administration of Matillion Designer. Management of Matillion agents.

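To illustrate the Snowflake object and access management mentioned above, here is a minimal sketch using the snowflake-connector-python package. The account, credentials, and object names are placeholders, and a production setup would typically drive this DDL through a tool like schemachange rather than an ad-hoc script.

```python
import snowflake.connector

# Placeholder credentials; real deployments would use a secrets manager or key-pair auth.
conn = snowflake.connector.connect(
    account="my_account",
    user="deploy_user",
    password="***",
    role="SYSADMIN",
    warehouse="ADMIN_WH",
)

try:
    cur = conn.cursor()
    # Create core objects (hypothetical names) and grant access to a reader role.
    cur.execute("CREATE DATABASE IF NOT EXISTS ANALYTICS")
    cur.execute("CREATE SCHEMA IF NOT EXISTS ANALYTICS.RAW")
    cur.execute("""
        CREATE TABLE IF NOT EXISTS ANALYTICS.RAW.ORDERS (
            ORDER_ID NUMBER,
            ORDER_TS TIMESTAMP_NTZ,
            AMOUNT NUMBER(12, 2)
        )
    """)
    cur.execute("GRANT USAGE ON DATABASE ANALYTICS TO ROLE REPORTING_READER")
    cur.execute("GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.RAW TO ROLE REPORTING_READER")
finally:
    conn.close()
```
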
Responsibilities

  • Salary: Rs. 0.0 - Rs. 15.0
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: DevOps Engineer - Sandip

Job Description

Job Description:
  • Deep Microsoft Azure product knowledge and understanding of product features.
  • Proven track record in designing, developing and implementing data modelling at a conceptual, logical and physical level.
  • Experience with accountability for end-to-end data architecture and lifecycle management (including non-functional requirements and operations).
  • Understands industry-recognised data modelling patterns and standards.
  • Understands overall IT system design, in particular networking, authorization and authentication protocols, data security, and disaster recovery.
  • Knowledge of the latest technologies to process large amounts of data (Apache Spark; Databricks is a plus).
  • Experience in MDM, Metadata Management, Data Quality and Data Lineage tools.
  • Expertise in cloud security and identity best practices.
  • A deep understanding of ETL and ELT tools and techniques.
  • Exposure to other cloud solutions (Google, AWS) a plus.
  • Strong experience in architecting "fit-for-purpose" and "fit-for-use" complex solutions on the Microsoft Azure cloud stack (IaaS, PaaS and SaaS).
  • Very strong hands-on experience with Microsoft Azure services such as Azure Data Factory, Databricks (e.g. processing streaming data using Spark clusters), Blob containers, ESB, Event Grid, Azure SQL Server, Cosmos DB, Azure Functions, and analytics (e.g. Power BI) is a mandatory requirement (see the sketch after this list).
  • Excellent understanding of architecture patterns and best practices, with proven innovative and leading-edge thinking to build cloud-ready systems (e.g. domain-driven architecture, multi-tenancy, building for resilience, scalability, performance, federation, microservices).
  • Hands-on work experience on application platforms like .NET, Python, Java preferred.
  • Understanding of big data platform concepts and NoSQL technologies, and the ability to map such technologies.
  • Experience in modelling traditional structured data as well as information design in Data Lake and NoSQL database engines.
  • Fluent with modelling, development and project execution processes around star, snowflake, 3NF, and Data Vault approaches.
  • Good experience working with data using SQL, Python and Scala in Microsoft-based resources like Azure Synapse and Data Lake.
  • Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.

Essential Skills: the Azure data architecture and data modelling requirements listed above.

Desirable Skills:
  • A great team player, with the ability to integrate into and play a pivotal role in a team that also includes data scientists, data engineers and data analysts.
  • Able to weigh short-term vs. long-term goals and take both tactical and strategic decisions.
  • Great communication skills; ability to communicate complex technical concepts to a non-technical audience.
  • Strong organizational skills; the ideal candidate can work in a fast-paced environment and quickly adapt to changing priorities.
  • Stays on top of the latest trends and develops expertise in emerging cloud technologies. Works well as a technical leader and as an individual contributor.
  • Builds processes supporting data transformation, data structures, metadata, dependency, and workload management.
  • Guides the development of an Agile data development experience, when required using DW automation tools.
  • Assists the project team in planning and execution to achieve daily and sprint goals.
  • Produces articulate documentation of project artefacts and deliverables.
  • Identifies and communicates risks clearly to the Product Owner and the team.
  • Educates the customer on how the recommended technology/solution will fulfil the customer's expressed needs.
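
As a rough illustration of the hands-on Databricks/Spark streaming requirement above, the sketch below reads a stream of JSON events from Azure Blob/ADLS into a Delta table with PySpark Structured Streaming. The storage path, event schema, and table name are invented for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType, DoubleType

spark = SparkSession.builder.appName("streaming-ingest-example").getOrCreate()

# Invented event schema; a real pipeline would match the producer's contract.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("amount", DoubleType()),
])

# Placeholder ADLS path; on Databricks this would typically be an abfss:// URI.
events = (
    spark.readStream
    .schema(schema)
    .json("abfss://events@examplestorage.dfs.core.windows.net/incoming/")
)

# Write the stream into a Delta table with checkpointing for recovery.
query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")
    .toTable("analytics.raw_events")
)

query.awaitTermination()
```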

Responsibilities

  • Salary: As per industry standard.
  • Industry: IT-Software / Software Services
  • Functional Area: IT Software - Application Programming, Maintenance
  • Role Category: Programming & Design
  • Role: Databricks, Azure Data Factory