Experience in Big Data / Data Engineering
Minimum 3 years of experience in a technical role, preferably in the data or cloud field.
Knowledge of at least one public cloud platform (AWS, Azure, or GCP) is required.
Knowledge of a programming language: Python, Scala, or SQL
Knowledge of and experience with Databricks or Snowflake (at least one mandatory)
Knowledge of the end-to-end data analytics workflow
Hands-on professional or academic experience in one or more of the following:
Data Engineering technologies (e.g., ETL, dbt, Apache Spark™, Airflow)
Data Warehousing technologies (e.g., SQL, Stored Procedures, Redshift, Snowflake)
Excellent time management and prioritization skills
Excellent written and verbal communication
Bonus - Knowledge of Data Science and Machine Learning (e.g., building and deploying ML models)