Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and deliver deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also help millions of people realize their financial goals and save time and money.
We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments.
We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.
Job Description
We are seeking a highly skilled Staff Engineer with extensive expertise in AWS and Big Data technologies to join our Engineering team. The ideal candidate will have a proven track record of designing, developing, and implementing scalable software solutions that leverage AWS cloud services and Big Data platforms. This role requires strong programming skills, deep knowledge of AWS services, and experience with a variety of Big Data tools and frameworks. You will be responsible for designing, implementing, and maintaining scalable, high-performance data solutions; leading complex projects; mentoring junior engineers; and driving best practices for data engineering and cloud infrastructure.
Responsibilities:
- Architect and Design: Design and implement scalable data solutions using AWS services (e.g., Amazon EMR, Amazon Athena, Amazon DynamoDB, AWS Glue, Amazon S3) and Big Data technologies (e.g., Hadoop, Spark, Presto).
- Collaboration: Collaborate with cross-functional teams to gather requirements, define architecture, and implement solutions that meet business needs.
- Cloud Infrastructure: Manage and optimize AWS infrastructure to ensure high availability, performance, and cost efficiency. Utilize AWS tools for monitoring, security, and automation.
- Data Pipeline Development: Build and optimize data pipelines to handle large-scale data ingestion, processing, and storage. Ensure data quality and reliability.
- Leadership and Mentoring: Provide technical leadership and mentorship to junior engineers. Lead code reviews and design reviews, and contribute to the development of engineering best practices.
- Performance Optimization: Identify and resolve performance bottlenecks. Implement best practices for data storage, retrieval, and processing.
- Innovation: Stay current with emerging technologies and industry trends. Propose and implement new tools, technologies, and processes to enhance data engineering practices.
- Documentation and Reporting: Create and maintain comprehensive documentation for data architectures, processes, and systems. Provide regular updates and reports to stakeholders.
- Develop batch applications using Spark with Scala or PySpark.
- Design, develop, and maintain complex data pipelines and workflows using Apache Airflow.
- Configure and manage Airflow environments, including scheduling, monitoring, and troubleshooting workflow execution.
- Implement monitoring, logging, and alerting solutions to ensure system reliability and availability.
Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or related field.
- Proven experience as a Software Engineer, with at least 15 years developing distributed cloud-based applications.
- Strong proficiency in programming languages such as Java, Python, or Scala, and familiarity with modern software development practices and tools.
- AWS certifications such as AWS Certified Solutions Architect or AWS Certified Developer.
- Experience with Big Data frameworks such as Hadoop, Spark, Kafka, and Athena, as well as related tools such as Hive, HBase, and Presto.
Preferred Qualifications:
- Experience with Amazon EMR, AWS Lambda, Amazon ECS, Amazon DynamoDB, and Apache Airflow.
- Familiarity with DevOps practices and tools for continuous integration, deployment, and automation.
- Knowledge of machine learning concepts and frameworks, and experience in deploying ML models on AWS infrastructure.
- Knowledge of stream processing technologies and real-time data analytics platforms.
Join our team and play a key role in building cutting-edge software solutions that leverage the power of AWS and Big Data to drive innovation and business growth.
Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are key differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward and recognition, volunteering... the list goes on. Experian's strong people-first approach is award winning: Great Place To Work in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (4.4 stars globally), to name a few. Check out Experian Life on social media or our Careers Site to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.