Job Description
Key job responsibilities
- Design, implement, and support a platform that provides secure access to large datasets.
- Develop and maintain scalable, end-to-end data infrastructure and data pipelines.
- Design and implement routines to extract, transform, and load data from a wide variety of data sources using Python, SQL, scripting, and AWS big data technologies (see the sketch after this list).
- Design and implement complex ETL pipelines and other BI solutions.
- Work closely with product owners, developers, and data strategists to explore new data sources and deliver the data.
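For illustration only (not part of the posting): a minimal Python sketch of the kind of extract-transform-load routine described above, assuming CSV files land in a "raw" S3 bucket and cleaned rows are written back to a "curated" bucket. The bucket names, keys, and the id/event_date columns are hypothetical placeholders.

```python
import csv
import io

import boto3

s3 = boto3.client("s3")


def run_etl(raw_bucket: str, raw_key: str, curated_bucket: str, curated_key: str) -> None:
    # Extract: read a raw CSV object from S3 (hypothetical bucket/key).
    obj = s3.get_object(Bucket=raw_bucket, Key=raw_key)
    rows = csv.DictReader(io.StringIO(obj["Body"].read().decode("utf-8")))

    # Transform: drop rows without an id and truncate the timestamp to a date.
    cleaned = [
        {**row, "event_date": row["event_date"][:10]}
        for row in rows
        if row.get("id")
    ]
    if not cleaned:
        return

    # Load: write the cleaned records to the curated bucket as CSV.
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=list(cleaned[0].keys()))
    writer.writeheader()
    writer.writerows(cleaned)
    s3.put_object(
        Bucket=curated_bucket,
        Key=curated_key,
        Body=out.getvalue().encode("utf-8"),
    )
```

In practice a routine like this would typically run inside an AWS Glue job or a Lambda function rather than as a standalone script.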
Basic qualifications
- At least 3-5 years of hands-on experience developing data ingestion, data processing, and analytical pipelines for big data, relational database, NoSQL, and data warehouse solutions on the AWS Cloud Platform.
- Experience with AWS data services, in particular AWS Glue, Lambda, RDS PostgreSQL, S3, Redshift, Athena, and other data infrastructure automation services.
- Experience with scripting in a serverless architecture using Python (a minimal sketch follows this list).
- Experience with data modeling, querying, and optimization for relational databases, NoSQL stores, data warehouses, and data lakes.
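For illustration only (not part of the posting): a minimal sketch of Python scripting in a serverless architecture, here a Lambda handler that launches an Athena query over a data lake in S3. The database, table, and result location are hypothetical and would come from the environment in a real deployment.

```python
import os

import boto3

athena = boto3.client("athena")


def lambda_handler(event, context):
    # Kick off a simple aggregation over a (hypothetical) data lake table.
    response = athena.start_query_execution(
        QueryString=(
            "SELECT event_date, COUNT(*) AS events "
            "FROM page_views "
            "GROUP BY event_date"
        ),
        QueryExecutionContext={
            "Database": os.environ.get("ATHENA_DATABASE", "analytics")
        },
        ResultConfiguration={
            "OutputLocation": os.environ.get(
                "ATHENA_OUTPUT", "s3://example-athena-results/"
            )
        },
    )
    # Athena runs asynchronously; downstream steps poll on this execution id.
    return {"queryExecutionId": response["QueryExecutionId"]}
```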