Senior Data Pipeline Engineer - DataHub
The DataHub Engineering team provides a distributed platform for hosting datasets, complete with managed data stores, search, discovery, batch analytics, and real-time stream processing capabilities. Our goal: ensure that high-quality content, which is indispensable to financial markets, is cataloged, standardized, discoverable, distributed, and accessible in one place.
Who are you:
The ideal candidate is an innovative problem solver who enjoys working in multiple roles and thrives in a fast-paced, collaborative environment. You are curious, kind, continually learning, and happy to share what you learn. You enjoy pursuing complex issues to their root cause. You love distributed messaging, performance at scale, and engineering for reliability.
What's in it for you:
As a senior data pipeline engineer on the DataHub team, you will build Kafka-based data pipeline infrastructure that scales to address the needs of all financial datasets at Bloomberg. You will engineer for reliability, scale, performance, efficiency, observability, and ease of use. This is an opportunity to engineer systems at massive scale and to gain valuable experience in distributed computing. You'll be surrounded by people who are passionate about distributed computing and believe that world-class service is critical to customer success. You'll get the chance to work with engineering teams across Bloomberg, understand their application requirements, and build systems together.
You'll need to have:
We'd love to see:
If this sounds like you, apply!
Bloomberg is an equal opportunities employer, and we value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.