Position – Data Analyst
Location – BGC Taguig
Job Type – 6-month contract with LanceSoft, extendable and renewable
Work Mode – Hybrid setup, onsite 2–3x a week (typically 3 days office, 2 days WFH)
What you'll focus on:
Develop ETL jobs.
Enhance ETL jobs by applying best practices and tuning Spark configurations.
Convert and migrate jobs from one big data platform to another.
Integrate jobs end to end, transforming ingested source data into various targets such as databases, extract files, and streaming platforms.
Keep up with technology trends in big data ETL.
Troubleshoot, test, and validate ETL data mappings before implementation.
Generate reports and create dashboards for analysis.
Rationalize reports (including data mapping, identification of data sources, and alignment on which table sources to use).
Discuss proposed rationalized dashboards with business users.
Improve efficiency and productivity through process simplification, automation, and democratization of data.
Conduct promo post-mortem analyses to determine the success of launched products.
Support resolution of big data issues encountered on enterprise data systems and frameworks.
Ideal skills:
Programming languages: Java 11 or higher (for ingestion), SQL, Python
Reporting tools like Tableau and Looker
Great to have: dbt, Databricks, Kafka, Airflow, NiFi, Linux administration, Bitbucket
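For candidates new to the terminology, the extract–transform–load pattern referenced throughout the responsibilities can be sketched in a few lines of Python. This is a minimal, self-contained illustration only; the data, column names, and the `sales` table are invented for the example and are not part of this role's actual stack, which uses Spark-scale tooling rather than SQLite:

```python
import csv
import io
import sqlite3

# Extract: parse a (hypothetical) CSV source
raw = "id,amount\n1,10\n2,-5\n3,7\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast types and keep only positive amounts
clean = [(int(r["id"]), float(r["amount"]))
         for r in rows if float(r["amount"]) > 0]

# Load: write into a target database
# (in-memory SQLite stands in for a real warehouse)
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = con.execute("SELECT SUM(amount) FROM sales").fetchone()[0]
print(total)  # 17.0
```

In the role itself, each of these three stages would typically run as a distributed Spark job against big data platforms rather than a single-process script.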