Data & Analytics Lead
What makes Cognizant a unique place to work? The combination of rapid growth and an international, innovative environment! This creates many opportunities for people like YOU — people with an entrepreneurial spirit who want to make a difference in this world. At Cognizant, together with colleagues from all around the world, you will collaborate on creating solutions for the world's leading companies and help them become more flexible, more innovative, and more successful. This is your chance to be part of the success story.

Position Summary
This position requires someone to lead project delivery by ensuring that the roadmap is aligned with the client's strategy and business objectives, and that the solution is implemented within the agreed timeline, scope, and budget. You will also lead client engagements by identifying their needs and recommending tailored solutions, and work closely with data engineers, analysts, IT teams, and business stakeholders to ensure seamless data flow and solution alignment.

Mandatory Skills
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience designing highly available and scalable data and infrastructure solutions on cloud platforms such as AWS and Azure.
- 5+ years of project management and team leadership experience.
- Excellent communication skills with both technical and non-technical stakeholders.
- Agile/Scrum experience and the ability to manage multiple projects efficiently.
- Expertise in developing and maintaining the complete data lifecycle, including data ingestion, data modelling, data warehousing, data governance, data analytics, and reporting.
- 5+ years of experience with Python and SQL.
- Expertise in AWS services such as DMS, Lambda, Glue, Redshift, S3, Athena, RDS, DynamoDB, SQS, ECS, EventBridge, Step Functions, CloudFormation, IAM, VPC, SNS, and CloudWatch, with a strong command of serverless and containerisation (Docker, Kubernetes) technologies.
- 3+ years of experience with dbt (Data Build Tool) and data analytics and reporting tools such as Microsoft Power BI.
- Expertise in DevSecOps concepts such as automation and scripting (Python, SQL), IaC tools (CloudFormation, Terraform), CI/CD tools (Bitbucket Pipelines, Jenkins), monitoring tools (New Relic), and security best practices.
- Proficiency in networking concepts such as DNS, load balancing, firewalls, VPNs, and HTTP/HTTPS protocols.
- 5+ years of experience developing and delivering in-house training for junior team members and peers.

Roles and Responsibilities
- Design and develop highly available and scalable data and infrastructure solutions and efficient data pipelines on cloud platforms such as AWS and Azure to collect, transform, and store data from diverse sources, ensuring data quality and accessibility for downstream users such as data analysts and BI tools.
- Develop and optimise data models, including relational, dimensional, and NoSQL schemas, to perform data analytics and generate reports per business requirements.
- Understand and implement data governance requirements that support data and analytics solutions, including data quality, data lineage, data classification, and reusability.
- Automate repetitive tasks and optimise data workflows for efficiency and scalability.
- Identify bottlenecks in data processing and optimise for performance.
- Apply cloud cost management and optimisation strategies to balance performance with cost.
- Mentor junior data engineers and data analysts and provide technical guidance on best practices, code reviews, and career development.
- Maintain strong client satisfaction and foster long-term relationships.
- Develop and deliver impactful presentations and workshops for stakeholders.
- Support business development efforts through proposal creation and client outreach.
- Stay up to date with the latest developments in data engineering and analytics tools, technologies, and best practices.
- Contribute to the evolution of the data engineering and analytics practices within the company.
- Project Management: Ensure that data engineering and analytics projects are delivered on time, within scope and budget, and with high quality. Manage resource allocation and closely monitor project progress, proactively addressing any challenges that arise.
- Requirement Gathering: Engage with clients to gain a comprehensive understanding of their needs, challenges, and objectives. Work with business stakeholders to gather requirements for data solutions, translating them into technical specifications.
- Data Engineering and Warehousing: Lead the development and enhancement of efficient ELT (Extract, Load, Transform) pipelines based on AWS Glue and Apache PySpark jobs for ingesting data from various sources, such as APIs and databases, into a Redshift data warehouse hosted in an AWS environment (an illustrative sketch follows this list). Oversee real-time data streaming initiatives.
- Data Modelling: Lead the design and development of SQL data models using dbt that support business analytics and reporting initiatives. Continuously assess and improve existing data models to reflect evolving business requirements and new data sources, and optimise their performance.
- Data Analysis and Reporting: Transform data into meaningful insights using Microsoft Power BI and present it clearly and accurately to business stakeholders so they can make informed decisions.
- Data Governance: Identify best practices for managing metadata, including data discovery, data lineage, and data cataloguing, and implement data governance tools.
- Solution Architecture Design: Collaborate closely with cross-functional teams to develop and customise strategic and technical solutions. Ensure that the data architecture is scalable and reliable enough to handle growing datasets and workloads.
- DevSecOps Automation: Adhere to DevSecOps best practices such as identity management, secure network communication, CI/CD automation, infrastructure-as-code deployment, monitoring, logging, and alerting.
- Mentorship: Mentor junior engineers to help them improve their technical skills and foster career growth through ongoing feedback and knowledge sharing. Lead the team effectively to promote efficiency and align the team's objectives with organisational goals.
- Stakeholder Communication: Provide regular updates to leadership on project progress, team performance, and any issues or risks related to data engineering initiatives to ensure alignment and informed decision-making throughout the project lifecycle.
- Strategic Planning & Continuous Improvement: Stay up to date with emerging data technologies, industry trends, and best practices to continually improve the team's capabilities and tools. Consistently evaluate and refine processes to enhance efficiency, effectiveness, and overall team delivery.
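For context on the data engineering responsibilities above, here is a minimal, hypothetical PySpark sketch of the kind of ELT step this role oversees: extracting raw events, applying light transformations, and loading an aggregate for downstream analytics. The dataset, column names, and paths are illustrative assumptions; in practice such a job would typically run on AWS Glue with IAM-managed connections and write to Redshift or S3 rather than the local Parquet target used here.

```python
# Illustrative PySpark ELT sketch. All source/target names, paths, and column
# names are hypothetical assumptions, not part of the actual engagement.
from pyspark.sql import SparkSession, functions as F

def run_pipeline(source_path: str, target_path: str) -> None:
    spark = SparkSession.builder.appName("orders_elt_sketch").getOrCreate()

    # Extract: read raw order events (JSON lines) from a landing zone.
    raw = spark.read.json(source_path)

    # Transform: basic cleansing plus a simple dimensional-style aggregate.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_total") >= 0)
           .withColumn("order_date", F.to_date("order_timestamp"))
    )
    daily_revenue = cleaned.groupBy("order_date").agg(
        F.sum("order_total").alias("total_revenue"),
        F.countDistinct("customer_id").alias("unique_customers"),
    )

    # Load: write partitioned Parquet. In an AWS deployment this target would
    # typically be an S3 prefix or a Redshift table loaded via COPY.
    daily_revenue.write.mode("overwrite").partitionBy("order_date").parquet(target_path)

    spark.stop()

if __name__ == "__main__":
    run_pipeline("data/raw/orders/", "data/warehouse/daily_revenue/")
```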
Qualifications & Certifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- SCRUMstudy Scrum Master Certified.
- AWS Certified.

Salary Range: >$100,000
Date of Posting: 5th of March, 2025

Next Steps: If you feel this opportunity suits you, or Cognizant is the type of organization you would like to join, we want to have a conversation with you! Please apply directly with us. Cognizant is committed to providing Equal Employment Opportunities. Successful candidates will be required to undergo a background check.