Description:
Published: 05/05/2021
About us
We are a company focused on agile technological solutions that simplify and generate real value in the day-to-day of organizations. We work to evolve your business quickly, consistently, and in step with the ever-accelerating changes of our time.
We believe that individuals and their interactions are what build both your business and ours. This is our way of doing it: with technology and simplicity.
Responsibilities:
Data preparation (data ingestion/transformation) for the other members of the team.
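To illustrate the kind of data-preparation work this role involves, here is a minimal sketch in plain Python (the field names and sample CSV are hypothetical; in practice this would typically be done with PySpark or SQL on EMR/Athena, per the requirements below):

```python
import csv
import io

# Hypothetical raw CSV of transaction events, as might land in an ingestion bucket.
raw = io.StringIO(
    "user_id,amount\n"
    "1,10.5\n"
    "2,3.0\n"
    "1,4.5\n"
)

# Ingestion: parse rows into records with typed fields.
rows = [
    {"user_id": int(r["user_id"]), "amount": float(r["amount"])}
    for r in csv.DictReader(raw)
]

# Transformation: aggregate amount per user for downstream team members.
totals = {}
for r in rows:
    totals[r["user_id"]] = totals.get(r["user_id"], 0.0) + r["amount"]

print(totals)  # {1: 15.0, 2: 3.0}
```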
Must Haves:
- 3+ years experience with Hadoop, Spark (PySpark, Scala)
- 3+ years experience with Python for data engineering
- SQL
- Git development workflow
- DevOps/CI-CD tools (JIRA, GitLab, GitLab CI, NEXUS, SonarQube)
- AWS fundamentals (CloudFormation, IAM, Lambda, SQS, Athena, EMR)
- Unix
- AWS Certified Solutions Architect (Associate) or AWS Certified Big Data certification
Nice to Haves:
- Experience with Airflow
- Docker
Skills
- AWS
- SQL
- Unix
For every challenge, a solution.