Data Scientist
Posted on Sept. 1, 2025 by TP ICAP
- Robertsganj, Singapore
- N/A
- Full Time

Our purpose is to provide clients with access to global financial and commodities markets, improving price discovery, liquidity, and distribution of data, through responsible and innovative solutions.
Through our people and technology, we connect clients to superior liquidity and data solutions.
The Group is home to a stable of premium brands. Collectively, TP ICAP is the largest interdealer broker in the world by revenue, the number one Energy & Commodities broker in the world, the world’s leading provider of OTC data, and an award-winning all-to-all trading platform.
Founded in London in 1866, the Group operates from more than 60 offices in 27 countries. We are 5,200 people strong. We work as one to achieve our vision of being the world’s most trusted and innovative liquidity and data solutions specialist.
Parameta Solutions Overview
Parameta Solutions is the Data & Analytics division of TP ICAP Group. The business provides clients with unbiased OTC content and proprietary data, and in-depth insights across price discovery, risk management, benchmarks and indices, and pre- and post-trade analytics. Its post-trade solutions offering helps market participants control their counterparty and regulatory risks through a growing range of tools that manage balance-sheet exposure, as well as compression and optimisation services. The Data & Analytics division includes the following brands: Tullett Prebon Information, PVM Data Services, ICAP Information and Burton-Taylor Consulting.
Role Overview
This is an exciting opportunity to work with cutting-edge technologies and complex financial datasets, applying statistical methods, data science techniques, and modern software engineering practices to deliver impactful insights and innovative data products. Our group operates with a start-up mindset, giving you the freedom to explore new tools, adopt best practices, and help shape the future of Parameta Solutions’ data strategy.
Our tech stack includes Python, Snowflake, Jupyter, GitLab, Docker, Airflow, AWS, and GCP.
Key Responsibilities
- Build scalable Python applications to query, analyze, and combine large, complex financial datasets using statistical and computational techniques.
- Develop robust quality control and monitoring tools to ensure data accuracy, integrity, and consistency across products and workflows.
- Create insightful dashboards and visualizations to track application performance, monitor key metrics, and communicate results to technical and non-technical stakeholders.
- Support the design, implementation, and deployment of machine learning models and predictive analytics to enhance data-driven decision-making.
- Partner with data engineering, product, and commercial teams to deliver high-quality, scalable applications and solutions aligned with business needs.
- Enhance and maintain existing applications, applying best practices in software development, testing, and documentation.
- Stay current with industry trends in data science, cloud computing, and financial market analytics; proactively recommend new technologies or methodologies to drive efficiency and innovation.
- Contribute to workflow automation using tools such as Airflow and Docker to streamline data pipelines and product delivery.
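To give a concrete flavour of the quality-control work described above, here is a minimal, hypothetical Python/Pandas sketch of the kind of check such tooling might run. The dataset, column names, and staleness rule are all illustrative assumptions, not part of any actual Parameta Solutions workflow.

```python
import pandas as pd

# Hypothetical OTC price snapshot; instruments and values are made up.
prices = pd.DataFrame(
    {
        "instrument": ["EURUSD", "GBPUSD", "USDJPY"],
        "mid_price": [1.0850, 1.2700, None],
        "as_of": pd.to_datetime(
            ["2025-09-01 09:00", "2025-09-01 09:00", "2025-08-31 17:00"]
        ),
    }
)

def quality_flags(df: pd.DataFrame, cutoff: str) -> pd.DataFrame:
    """Flag rows with missing prices or timestamps older than `cutoff`."""
    out = df.copy()
    out["missing_price"] = out["mid_price"].isna()
    out["stale"] = out["as_of"] < pd.Timestamp(cutoff)
    return out

flags = quality_flags(prices, "2025-09-01")
print(flags[["instrument", "missing_price", "stale"]])
```

In practice, checks like this would be scheduled via Airflow and surfaced on a monitoring dashboard rather than printed, but the core pattern — derive boolean flags per record and alert on violations — is the same.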
Experience & Competencies
- Advanced degree (Master’s or PhD) in Data Science, Mathematics, Engineering, Computer Science, or related field.
- Hands-on professional experience in a numerical or computational role, preferably within financial services or technology.
- Demonstrated proficiency in Python, with emphasis on NumPy, Pandas, SciPy, and best practices in OOP.
- Proficiency in SQL for querying and managing large datasets.
- Experience working with large, complex, and structured/unstructured datasets.
- Demonstrable problem-solving and analytical skills, with the ability to derive actionable insights from data.
- Exposure to cloud technologies (AWS or GCP).
- Exceptional collaboration and communication skills, with experience working cross-functionally across engineering, product, and commercial teams.
Desired
- Experience with financial market data, trading workflows, or risk/analytics use cases.
- Familiarity with Snowflake or similar modern data warehouse technologies.
- Experience with Airflow (workflow orchestration) and Docker (containerization).
- Knowledge of CI/CD pipelines (GitLab or similar).
- Hands-on experience with machine learning models (e.g., scikit-learn, TensorFlow, PyTorch).
- Experience developing interactive dashboards (e.g., Plotly Dash, Streamlit, Power BI, Tableau).
- Knowledge of data architecture, reference data, or data pipelines in a financial services context.
- Interest in staying ahead of emerging trends in AI, data science, and financial technology.
Band & Level
- Professional / 5
#PARAMETA #LI-ASO #LI-Hybrid
Advertised until:
Oct. 1, 2025