#107007 Expert Data Engineer Closed

  • Long term (more than 6 months)
Description:

We are looking for an Expert Data Engineer to join our client’s team on a subcontract basis.

Our client is an exciting tech company and the leading sports betting provider in Germany and globally. Their mission is to electrify the sports betting experience for every customer, every bet. The client is a product organization at its core.
All of the client’s functions are aligned towards delivering the best end-to-end product experience to customers. Product and technology teams work hand in hand to advance the development of the client’s products and services.

Should have:
  • 5+ years of experience coding in Python and SQL, with solid CS fundamentals including data structure and algorithm design
  • Experience with large-scale production databases and SQL
  • Experience with time-series/analytics databases, preferably Elasticsearch or OpenSearch
  • Experience with one or both of the following:
1) ETL: experience with ETL development (extraction, data loading, aggregation; ideally with Talend)
and/or
2) Real-time engineering: hands-on implementation experience or familiarity with a combination of the following technologies: Hadoop, NiFi, Kafka, Spark, and Hudi
  • 2+ years of experience with cloud data platforms (AWS): S3, Athena, EC2, Redshift, EMR, and Lambda
  • Experience with Docker/Kubernetes
  • Knowledge of SQL and MPP databases (e.g. Vertica, Netezza, Greenplum, Aster Data)
  • Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
  • Knowledge of Data Warehousing design, implementation, and optimization
  • Knowledge of Data Quality testing, automation, and results visualization
  • Knowledge of BI report and dashboard design and implementation
  • Experience participating in an Agile software development team, e.g. SCRUM
  • Experience designing, documenting, and defending designs for key components in large distributed computing systems
  • A consistent track record of delivering exceptionally high-quality software on large, complex, cross-functional projects
  • Demonstrated ability to learn new technologies quickly and independently
  • Ability to handle multiple competing priorities in a fast-paced environment
  • Undergraduate degree in Computer Science or Engineering from a top CS program required; Master’s degree preferred
  • Experience supporting data scientists and complex statistical use cases is highly desirable
  • Technology Stack: Python, SQL, AWS
  • Collaborative, willing to help, communicative, able to clearly express and discuss ideas
  • Curious mind and willingness to work with the client in a consultative manner to identify areas for improvement
  • Good analytical skills
  • Good team player, motivated to develop and solve complex tasks
  • Self-motivated, self-disciplined and result-oriented
  • Strong attention to detail and accuracy
  • English - Upper-Intermediate+
  • Location: Spain, Poland, Bulgaria, Romania

Nice to have:
  • Experience with Airflow
  • Coding experience with Java/Scala
  • Familiarity with Apache Hudi

Responsibilities:
  • Responsible for building, deploying, and maintaining mission-critical analytics solutions that process data quickly at big data scale
  • Contributes design, code, configurations, and documentation for components that manage data ingestion, real-time streaming, batch processing, data extraction, transformation, and loading across multiple data stores
  • Owns one or more key components of the infrastructure and works to continually improve it, identifying gaps and improving the platform’s quality, robustness, maintainability, and speed
  • Cross-trains other team members on technologies being developed, while also continuously learning new technologies from other team members
  • Interacts with engineering teams and ensures that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
  • Performs development, QA, and DevOps roles as needed to ensure total end-to-end responsibility for solutions
  • Works directly with business analysts and data scientists to understand and support their use cases
  • Contributes to CoE activities and community building, participates in conferences, and shares expertise and best practices
  • Helps with sales activities, customer meetings, and digital services

We offer:
  • Full-time workload on a remote basis
  • Start - ASAP
  • Duration - 6+ months

If you think you are the right person, we'd welcome your application!

Contact me to get more details at as@it-outstaffing.com

Skills
  • Python
  • IBM Netezza
  • ETL
  • Kafka
  • Java
  • Docker
  • Hadoop
  • SQL
  • Redshift
  • Apache Airflow
  • Amazon Elastic Compute Cloud
  • AWS Lambda
  • Kubernetes
  • CSS
  • Vertica
  • Elasticsearch
  • Greenplum
  • Spark
  • Scala
