Senior Data Acquisition Engineer - Python, AWS

Terminal

Software Engineering
Colombia
Posted on Jul 29, 2025
About Daloopa

With Daloopa, use the most complete and accurate set of public-company historicals to build and update your models in your preferred method and format. Key data is added to your models instantly, and faster data powers faster, better insights. Are you ready to gain your edge with Daloopa?

About The Role

Daloopa’s mission is to become the market leader in high-quality, actionable data for the world’s top investment professionals. As a Senior Python Engineer on our Crawler Team, you will play a pivotal role in acquiring and transforming the raw data that serves as the backbone of every Daloopa product and decision. Our data acquisition systems are critical to our ability to provide unique, up-to-date, and reliable datasets for our clients. In this role, you will help design, develop, and sustain the advanced infrastructure that brings vast, diverse data sources into the Daloopa ecosystem. Your work directly enables investment research, automation, and strategic insights across our organization and client base.

What You’ll Do

  • Architect and implement systems to acquire large volumes of structured and unstructured data from a wide array of sources, ensuring completeness and quality.
  • Collaborate with product, engineering, and operations teams to understand evolving data requirements and turn them into robust, automated data collection solutions.
  • Enhance and optimize data ingestion pipelines to support timely delivery of critical datasets for business and client needs.
  • Monitor the changing data landscape and proactively source new types of data to maintain Daloopa’s competitive edge.
  • Address challenges related to scale, data freshness, and reliability to ensure data pipelines drive business value.
  • Mentor peers on best practices in data acquisition and data engineering.

What You’ll Bring

  • Demonstrated success building, maintaining, and evolving data pipelines at scale, such as architecting ETL workflows that routinely process millions of records per day or integrating data from dozens of disparate sources into production.
  • Ownership of end-to-end data acquisition workflows, with direct experience managing and improving data quality metrics (for example, reducing error rates or increasing completeness by specific percentages).
  • A proven track record designing resilient and efficient systems, able to adapt quickly to new requirements, handle complex edge cases, and maintain data integrity and completeness at all times.
  • Hands-on expertise with Python and frameworks like Scrapy (or equivalents), creating automated processes for data ingestion, cleaning, and enrichment; a minimal sketch follows this list.
  • Skilled at cross-functional collaboration, driving ambitious data goals and iterating rapidly with engineering, data, product teams, and other internal stakeholders.
  • Countless stories of optimizing, troubleshooting, and scaling data infrastructure for reliability, freshness, and performance, speaking to practical expertise rather than theory alone.
  • Known for leveling up those around you: you share knowledge freely, break down silos, and elevate team standards through clear documentation, communication, and mentorship.
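
For a flavor of the hands-on work described above, here is a minimal, hypothetical Scrapy spider of the kind this role might build. The spider name, start URL, and CSS selectors are illustrative assumptions only, not Daloopa’s actual crawlers or data sources.

    import scrapy

    class FilingsSpider(scrapy.Spider):
        # Hypothetical name and start URL; a real crawler would target
        # actual public-company disclosure pages.
        name = "filings"
        start_urls = ["https://example.com/filings"]

        def parse(self, response):
            # Emit one structured record per table row (selectors assumed).
            for row in response.css("table.filings tr"):
                yield {
                    "company": row.css("td.company::text").get(),
                    "period": row.css("td.period::text").get(),
                    "url": row.css("td a::attr(href)").get(),
                }
            # Follow pagination so the dataset stays complete and fresh.
            next_page = response.css("a.next::attr(href)").get()
            if next_page:
                yield response.follow(next_page, callback=self.parse)

Saved as filings_spider.py, this sketch could be run with a recent Scrapy release via scrapy runspider filings_spider.py -o filings.jsonl, emitting one JSON record per line.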