hockey, cycling, tennis and kitesurfing
At Daphne, Sacha led a project investigating how the data captured by Daphne's representatives could be used to further improve internal processes. He also built a dashboarding application using Python and Vue.js to make the existing data more readily available within the company.
Sacha joined VOO's data science team to cover for recent leavers while the company was in the middle of a migration to the cloud. His role was to monitor a number of daily data science jobs and to further improve the monitoring tools. In addition, he liaised with various departments to handle data analysis requests.
During his time as a PhD student, Sacha developed his data science skills and investigated a number of topics, including big data analytics, computer vision, natural language processing and neural networks. He was also a teaching assistant for the course “Big Data”, where he taught the coding sessions in Python and Apache Spark and designed and graded the student assessments. Lastly, he supervised master's students during their theses.
Sacha designed and implemented this project end-to-end using Power BI and Azure Cloud.
Sacha was responsible for scheduling the AI scripts using Apache Airflow. He designed and configured a best-practice setup running Apache Airflow in a Docker container on Azure Cloud.
In this project, Sacha's role was to support Data Engineering and to build the first personalization data science use cases. Core technologies used were PySpark, Databricks, Airflow, Kubernetes and the broader Azure Cloud PaaS components (incl. Azure Data Factory, Azure Batch, Blob Storage, Event Hubs and Azure Functions).
Sacha is supporting the internal data team in optimizing their data platform for resource and cost efficiency. In the next phase, he will support the building of predictive models to meet Renson's business needs.
Sacha helped set up the components required to capture sensor data from their process temperature solutions, using Azure Cloud components (IoT Hub, Data Lake).
Sacha supported Proximus in their search for a next-generation machine learning tool. In this role, he engaged with the various data science teams within Proximus to capture their specific needs and translate these into an RFP. He also assisted the team with a proof of concept using the selected tool. Throughout, his focus lay on translating the needs of the data science teams into a workable solution, keeping in mind the multitude of stakeholders across the organization.
Sacha assisted the team in shaping the business requirements and translating these into a functional architecture. He also supported and coached the internal team in the adoption of Azure tools such as Data Factory, Databricks, DevOps and Data Lake.
Sacha was responsible for setting up an Azure environment according to best practices. Alongside delivery, part of the exercise was knowledge transfer to the internal team.
Sacha took on the role of Senior Data Scientist, supporting the project strategy, defining the right direction and reviewing the work prepared by a Junior Data Scientist. He was responsible for all client-facing communication and stakeholder management.