As a Data Engineer, you should be comfortable with ETL techniques, typically applied on cloud infrastructure to large volumes of data, in both batch and streaming modes. You will also oversee database schemas, lifecycles, permissions, availability, and interconnections. We also expect you to be comfortable with data analysis and modeling techniques (aggregations, clustering, etc.).
Job Requirements
- Degree in Computer Science or a relevant field
- At least 3 years of proven experience as a Data Engineer
- Knowledge of data engineering languages and libraries: Python, Dataflow, Beam, Kafka, Hadoop, Kubernetes, …
- Knowledge of data engineering automation tools: Cloud Orchestration, Scheduler, Glue, Airflow, …
- Knowledge of data analysis tools and libraries: NumPy, pandas, scikit-learn (k-means, simple classifiers, …), Plotly, Seaborn, Jupyter
- Knowledge of Cloud SQL, BigQuery, Datastore and NoSQL frameworks
- Experience with cloud infrastructures (GCP, AWS, Azure, Cloudera, DO)
- Excellent communication and teamwork skills
- Ownership and independence
- An analytical mind
- Leadership and organizational skills are a plus
Job Description
- Develop, construct, test and maintain cloud architectures to nurture data-focused products and internal tools
- Align these architectures with business requirements
- Acquire and integrate data from external sources
- Design and implement lifecycle and retention policies
- Implement analytics models
- Manage data access and protection measures
- Write technical documentation
Apply Now