Job Summary

The role involves designing and optimizing scalable data pipelines using Apache Spark, Apache Kafka, and orchestration frameworks such as Prefect or Airflow. The candidate will work with large datasets and external APIs to support NLP models, collaborating with data scientists and ML engineers. Responsibilities include end-to-end data engineering, performance analysis, and client communication. The position is part-time, short-term, and remote within the US.

Required Skills

Data Engineering
Data Integration
Python
Data Analysis
Communication
Natural Language Processing (NLP)
Data Transformation
Workflow Orchestration
Apache Spark
Project Ownership
Apache Kafka
Real-time Data Processing
ETL/ELT Pipelines
Large-scale Data Processing

Benefits

Remote Work

Job Description

Do you want to work on cutting-edge projects with the world’s best IT engineers? Do you wish you could control which projects you work on and choose your own pay rate? Are you interested in the future of work and how teams will form in the cloud? If so, the Gigster Talent Network is for you.

Our clients rely on our Network for two main areas: Software Development and Cloud Services. In some cases they need help building great new products; in others, they want our expertise in migrating, maintaining, and optimizing their cloud solutions.

At Gigster, whether working with entrepreneurs to realize ‘the next great vision’ or with Fortune 500 companies to deliver a big product launch, we build really cool enterprise software on cutting-edge technology.

The Role:

We are seeking an experienced Data Engineer with deep expertise in data transformation at scale, particularly in integrating and processing data from third-party public APIs. This role is critical to enhancing and maintaining data pipelines that feed into Natural Language Processing (NLP) models.

What you’ll do:

  • Design, build, and optimize scalable ETL/ELT data pipelines using Apache Spark, Apache Kafka, and orchestration tools such as Prefect or Airflow (a minimal sketch of this kind of pipeline follows this list)

  • Integrate external data sources and public APIs with internal data systems

  • Work with large-scale datasets to support NLP model training and inference

  • Analyze existing pipelines and recommend enhancements for performance, reliability, and scalability

  • Collaborate with cross-functional teams, including data scientists and ML engineers

  • Own the end-to-end engineering process—from planning and technical design to implementation

  • Regularly report progress and outcomes to client stakeholders
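For a concrete picture of the pipeline work described in the first bullet above, here is a minimal, hypothetical sketch of an orchestrated extract-transform-load flow using Prefect. The endpoint, field names, and file path are illustrative assumptions, not details from this posting.

```python
# Hypothetical sketch: a small Prefect flow that pulls records from a
# public API and stages them for downstream NLP processing. The URL and
# all field names are assumptions for illustration.
import json
import urllib.request

from prefect import flow, task


@task(retries=3, retry_delay_seconds=10)
def extract(url: str) -> list[dict]:
    # Pull raw JSON records from the third-party API, retrying on failure.
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


@task
def transform(records: list[dict]) -> list[dict]:
    # Keep only the fields an NLP model would consume.
    return [{"id": r["id"], "text": r["body"]} for r in records]


@task
def load(records: list[dict], path: str) -> None:
    # Stage newline-delimited JSON for the training pipeline.
    with open(path, "w") as f:
        for r in records:
            f.write(json.dumps(r) + "\n")


@flow
def api_to_nlp_staging() -> None:
    raw = extract("https://api.example.com/v1/articles")  # placeholder endpoint
    load(transform(raw), "articles.jsonl")


if __name__ == "__main__":
    api_to_nlp_staging()
```

The same flow could run on a schedule or be ported to an Airflow DAG; the structure (a retried extract, a pure transform, and a simple load) carries over either way.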

What we’re looking for:

  • Proficiency in Python and experience with data transformation and data engineering best practices

  • Strong experience with Apache Spark, Apache Kafka, and Google Cloud Platform (GCP)

  • Hands-on experience with workflow orchestration tools (e.g., Prefect, Airflow)

  • Demonstrated experience working with large datasets and real-time data processing (see the streaming sketch after this list)

  • Experience building and maintaining ETL/ELT pipelines for analytical or machine learning use cases

  • Self-motivated, with excellent communication and project ownership skills
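Because the list above pairs Spark and Kafka with real-time processing, here is a minimal sketch of that pattern using Spark Structured Streaming. The topic name, message schema, and output paths are assumptions for illustration, and running it requires the spark-sql-kafka connector package on the Spark classpath.

```python
# Hypothetical sketch: consume JSON documents from a Kafka topic and
# append them to Parquet for downstream NLP training. Topic, schema,
# and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-nlp-stream").getOrCreate()

schema = StructType([
    StructField("id", StringType()),
    StructField("text", StringType()),
])

# Read the topic as a streaming DataFrame and parse each JSON payload.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "raw_documents")
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("doc"))
    .select("doc.*")
)

# Continuously append parsed documents to Parquet; the checkpoint
# directory lets the stream resume where it left off after a restart.
query = (
    events.writeStream.format("parquet")
    .option("path", "out/documents")
    .option("checkpointLocation", "out/_checkpoints")
    .start()
)
query.awaitTermination()
```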

Preferred Qualifications:

  • Familiarity with financial services data or regulated data environments

  • Experience with Snowflake or Google BigQuery

  • Experience with PostgreSQL and GCS (Google Cloud Storage)

  • Exposure to NLP workflows and data requirements for machine learning models

Logistics:

  • This is a part-time, short-term contract lasting 4 to 6 weeks

  • Preferred location: Remote (US)

Application deadline: Open until filled

Gigster

Gigster connects businesses with top-tier talent for custom software development, AI advancements & digital experiences. Get started with on-demand experts.

Date Posted: May 16th, 2025
Job Type: Part Time
Location: Remote | US
Salary: Competitive rates