Job reference: XTBGSE01
Bulgaria | Based out of Sofia
Posted March 2022
The Financial Times is one of the world’s leading business news organisations, recognised internationally for its authority, integrity and accuracy. The FT has a record paying readership of one million, three-quarters of which are digital subscriptions. It is part of Nikkei Inc., which provides a broad range of information, news and services for the global business community. This is an exciting opportunity for a Software Engineer - Data Platform to join as part of the Product & Technology organisation within the Financial Times.
FT Core brings together key digital assets like:
- Content and Metadata - enabling content publishing, discoverability and targeting
- Membership - powering seamless subscription, payments and access journeys, as well as the whole newspaper distribution customer journey
- Data Platform - enabling the storage and processing of readership data, and the insights drawn from it.
You will have the autonomy to select the tools and technologies you need to build and operate services that deliver FT brand-critical capabilities. Someone who is comfortable with the ever-changing technical landscape and keen to contribute to the company’s processes and broader know-how would thrive in this role.
Working in the Data Platform team, you will be responsible for delivering innovative technical solutions as you build and operate world-class platforms that grow the Financial Times’ strategic business models. To understand the Data Platform team’s goals and projects in more detail, take a look at this article.
We have recently published our Engineering Progression framework and the associated competencies for a Software Engineer relevant to this position.
The tech stack
We often use the tools below. It's not an exhaustive list, but it gives you a taste of what our technology stack looks like:
- Workflow orchestration tools like Airflow, Luigi or Oozie
- AWS: ECS/EKS, Kinesis/MSK (Kafka), Redshift
- Streaming technologies like Kafka, Spark or Flink
- Distributed SQL Engines like Presto, Athena or Impala
- GitHub, CircleCI
- Graphite, Grafana, Splunk
The hiring process
We understand that tech interviews are often stressful for no good reason, so we designed our interview process to be rigorous but friendly. We have tech tests that require minimal code writing and algorithm knowledge, a friendly system design exercise, and a set of questions that encourage you to tell us about your strengths and what makes you happy at work.
We're committed to equality and diversity in the tech industry, so we'll be especially happy to see applications from candidates from under-represented backgrounds. We encourage this by considering flexible working hours as well as tuning the hiring process to promote diversity.
This role will be right for you if you are happy to:
- Work directly with product owners and more senior engineers to fully shape solutions from inception to deployment and beyond
- Work within a team of engineers in an agile delivery team
- Develop high-quality data solutions within the Financial Times’ Data Platform
- Design and implement low-maintenance, well-monitored, secure and scalable solutions
- Design, build and operate solutions, from cradle to grave, which meet both functional and non-functional requirements
- Understand and play an active part in designing the architecture, tooling and release cycle processes used by the engineering teams across Product & Technology
- Contribute to company-wide processes, frameworks and guidelines
- Develop an in-depth understanding of the FT’s underlying data, data flows and data structures
- Develop a close relationship with our customers and provide operational support.
The ideal candidate will have:
- Good command of written and spoken English
- Extensive development experience in Python, including development of microservices
- Extensive experience working with databases
- Experience with containerization technologies such as Docker or Kubernetes (K8s)
- Knowledge of software design practices such as TDD and OOP design patterns
- Experience working as part of an Agile delivery team, using methodologies like Scrum
- Willingness and curiosity to learn and use new technologies and programming languages you are not yet familiar with
- Experience working with AWS Cloud services in the Big Data domain
- Experience productionizing Machine learning algorithms or Data science models
- Experience with streaming applications such as Kafka Streams or Spark Streaming
- Experience with release procedures using Jenkins or CircleCI
- Experience in working with ETL frameworks (job orchestration tools) such as Airflow or Luigi
We can't wait to meet you!