GX has built specialised knowledge AI assistants for the banking and insurance industry. Our assistants are fed with sector-specific data and knowledge and are easily adaptable through ontology layers to reflect institution-specific rules.
GX AI assistants are designed for Individual Investors and for Credit and Claims professionals, and they are in use right now at global financial institutions. Proven, trusted, and non-hallucinating, our assistants empower financial professionals and deliver 10x improvements by supporting them in their day-to-day tasks.
Role overview:
As a Sr. Data Engineer, you’ll drive the design, development, and optimization of our data engineering pipelines and services with a strong software engineering mindset.
You'll also play a pivotal role in fostering growth and collaboration within the team, mentoring junior colleagues, and refining processes to help us maintain excellence.
If you’re excited to make a lasting impact both technically and as a team leader, we’d love to meet you.
Desired skills:
We are looking for candidates with a strong educational background in Mathematics, Computer Science, Engineering, Physics, or a related field, along with 6+ years of proven experience in Data Engineering, including data warehousing, ETL, automation, and cloud technologies, or in Software Engineering in data-related areas.
Every day, you will rely on these hard skills, so if you have not mastered them already, we expect you to become confident with them quickly:
Strong software engineering principles to write maintainable, scalable code, primarily in Python (experience with Rust or C# is a plus).
Data processing and manipulation using Polars and Pandas, scaled via Apache Ray.
Efficient use of various data storage technologies such as S3, Apache Iceberg, DuckDB, ClickHouse, PostgreSQL, and others.
Containerization and orchestration tools like Docker and Kubernetes, with automated deployments in cloud environments (AWS, Azure, GCP).
Complex workflow orchestration using open-source tools such as Apache Airflow.
Big data and distributed processing technologies, including Hadoop, Spark, and Kafka.
On the soft skills side, you will need:
Experience leading and mentoring junior team members is an advantage, as you will help foster skill development, collaboration, and process improvements.
Cross-functional collaboration experience, working with software engineers, data scientists, and ML engineers to deliver high-impact solutions.
Why you don’t want to miss this career opportunity:
Be part of a groundbreaking AI project: Work on the world’s first personalized AI assistant for Financial Institutions, leveraging the latest advancements in Large Language Models (LLMs) to deliver unique data insights.
Work with cutting-edge technology: If you like working with innovative tools, you'll appreciate our tech stack, including Apache Ray, Polars, Apache Iceberg, ClickHouse, DuckDB, and more – all used to push the limits of data engineering.
Stay at the forefront of AI and data engineering: Join a team dedicated to staying on the leading edge of technology, providing you the opportunity to continuously learn and apply new skills in AI-driven data solutions.
Contribute to a collaborative, engineering-centered environment where leadership and learning go hand-in-hand.
Mentor and work alongside industry leaders and Fortune 100 clients on mission-critical projects.
Grow with a diverse Prague team that values professional development, with opportunities for formal training and informal knowledge sharing.
Enjoy a hybrid work environment with in-person team gatherings, social events, and office perks like coffee, tea, and beer on Thursdays.
Interested in learning more about what we’re building in Prague? Don’t hesitate to apply – we look forward to connecting with you!