Sr Data Engineer
Teradata
Lahore, Punjab PK

Job Description

This role is responsible for the development and implementation of data warehouse solutions for company-wide applications and for managing large sets of structured and unstructured data.

The candidate will be expected to analyze complex customer requirements and work with the data warehouse architect to gather, define, and document data transformation rules and their implementation.

Data Engineers will be expected to have a broad understanding of the data acquisition and integration space and be able to weigh the pros and cons of different architectures and approaches.

You will have a chance to learn and work with multiple technologies and thought leaders in the domain.

Responsibilities

  • Translate the business requirements into technical requirements
  • Develop ETL solutions using native and third-party tools and utilities
  • Write and optimize complex SQL and shell scripts
  • Design and develop code, scripts, and data pipelines that leverage structured and unstructured data
  • Build data ingestion pipelines and ETL processes, including low-latency data acquisition and stream processing
  • Design and develop processes and procedures for integrating data warehouse solutions into an operational IT environment.
  • Monitor performance and advise on any necessary configuration and infrastructure changes.
  • Create and maintain the technical documentation required to support solutions.
  • Coordinate with customers to understand their requirements
  • Work with Teradata project managers to scope projects, develop work breakdown structures, and perform risk analysis.
  • Lead a dynamic and collaborative team, demonstrating excellent interpersonal skills and management capabilities.
  • Readiness to travel to customer sites for short-, medium-, or long-term engagements.
Skills and Qualifications

  • B.S. / M.S. in Computer Science or a related field
  • Hands-on experience with one or more ETL tools such as Informatica, DataStage, or Talend
  • Strong grasp of and experience in designing and developing ETL architectures.
  • Strong RDBMS concepts and SQL development skills
  • Strong knowledge of data modeling and mapping
  • Experience with Data Integration from multiple data sources
  • Working experience in one or more business areas and industries, such as Telecom, Retail, or Financial Services
  • Good knowledge of Big Data technologies such as Pig, Hive, Spark, Kafka, and NiFi
  • Experience with NoSQL databases, such as HBase, Cassandra, MongoDB
  • Experience with any of the Hadoop distributions such as Cloudera / Hortonworks
  • Experience working with cloud technologies
  • Training / Certification on any Hadoop distribution will be a plus.
  • Strong communication and analytical skills.
  • Proven experience in customer-facing roles for large engagements and in managing solution delivery teams.