Data Engineer
Nowasys Services
Lahore, Punjab, Pakistan
1d ago

Job Description

As a Data Engineer, you will be responsible for developing best practices and making architectural decisions to rapidly improve critical data processing & analytics pipelines.

You will tackle hard problems to improve the platform’s reliability, resiliency, and scalability. We are looking for someone who thrives on autonomy and has experience driving long-term projects to completion.

You are detail- and quality-oriented, and excited about the prospect of making a big impact with data. Our tech stack includes Airflow, EMR, PySpark, and various AWS services.

Job type: On-Site

Location: Johar Town, Lahore

Timings: Standard

Salary: Market Competitive

Responsibilities :

Must have hands-on experience with AWS services (EMR, S3, Redshift, Lambda), PySpark, Apache Airflow, and Python.

Develops and maintains scalable data pipelines and builds out new API integrations to support continuing increases in data volume and complexity.

Collaborates with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.

Implements processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.

Writes unit/integration tests, contributes to the engineering wiki, and documents work.

Performs the data analysis required to troubleshoot data-related issues and assists in their resolution.

Works closely with a team of frontend and backend engineers, product managers, and analysts.

Designs data integrations and a data quality framework.

Designs and evaluates open-source and vendor tools for data lineage.

Works closely with all business units and engineering teams to develop a strategy for the long-term data platform architecture.

Requirements

Hands-on experience with AWS services (EMR, S3, Redshift, Lambda), PySpark, Apache Airflow, and Python.
