Work Experience

Full-time position working on Federated Learning and decentralized ML.

Part of an industrial PhD agreement (Convention CIFRE) with INRIA.

Working in multiple application contexts (e.g., NLP, CV, optimization).

2024 - 2027


Relai de thèse contract, full-time position. Research in Decentralized Machine Learning and Federated Learning.

January - March 2024


Member of the LINCS laboratory at Telecom Paris, as part of my PhD studies at Nokia.

January - March 2024

End-of-studies internship for a master's degree in Artificial Intelligence.

Worked in the AI & CV team at L'Oréal R&I. Collaborated with colleagues from digital optics and AI teams.

Developed a Generative Adversarial Network and a diffusion model for conditional image generation.

Implemented face image transformation, controlling skin tone and lighting direction in a continuous space.

Achieved a state-of-the-art FID of 2.1 and a CFID of 2.3 for scalable, high-resolution image generation.

March - August, 2023



Generated samples along the skin tone spectrum, from left to right, while preserving the same identity.

Generated samples along the light direction spectrum: from left to right of the image grid, the lighting direction shifts from left to right while the same identity is preserved.

Research training aligned with my master's studies. Supervised by Isabelle Guyon.

Worked as a mentor and team member in the HumanIA group on creating a new stylized version of the Meta-Album dataset previously produced by the group.

Used neural style transfer and investigated the effect of bias in image datasets on shortcut learning in deep neural networks.

Worked with a PhD student and my professor on Differentiable Neural Architecture Search (DNAS).

Conducted a literature review of DNAS methods and a comparative analysis.

Presented a report that was later integrated into a paper under review at ACM Computing Surveys.

June - August, 2022

Differentiable Neural Architecture Search (DNAS) is a bi-level optimization problem: the model weights are optimized on the training set, while the architecture is optimized on a separate validation set. It offers a more efficient approach to optimizing neural network architectures than reinforcement learning methods and other alternatives. The key idea is that the hyperparameters defining the architecture are optimized with a gradient-based approach: discrete parameters such as the number of filters or layers are relaxed into a continuous space, and the gradient is approximated in that space. Finally, when sampling from the continuous space, the variables are discretized so that the resulting model can be trained on the training set.
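The relaxation and alternating bi-level optimization described above can be illustrated with a toy sketch (illustrative only: the candidate operations, the finite-difference gradients, and the first-order alternation are my simplifications, not the actual DNAS implementations surveyed):

```python
import numpy as np

# Toy DNAS sketch: three hypothetical candidate operations compete on
# one "edge" of the search space; y = 2x is the true mapping, so the
# search should end up favoring the "double" operation.
ops = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,
    "zero":     lambda x: 0.0 * x,
}

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def mixed_op(x, alpha):
    # Continuous relaxation: a softmax-weighted sum of candidate ops,
    # making the architecture choice differentiable w.r.t. alpha.
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops.values()))

def discretize(alpha):
    # After the search, keep only the strongest operation.
    return list(ops.keys())[int(np.argmax(alpha))]

def loss(x, y, alpha, theta):
    return (theta[0] * mixed_op(x, alpha) - y) ** 2

x_train, y_train = 1.0, 2.0
x_val,   y_val   = 3.0, 6.0

alpha = np.zeros(len(ops))   # architecture parameters (outer level)
theta = np.array([1.0])      # model weight (inner level)
eps, lr = 1e-4, 0.1

for _ in range(200):
    # Inner step: update weights on the training loss (finite
    # differences stand in for backprop to stay dependency-free).
    g = (loss(x_train, y_train, alpha, theta + eps)
         - loss(x_train, y_train, alpha, theta - eps)) / (2 * eps)
    theta = theta - lr * g
    # Outer step: update architecture parameters on the validation loss.
    ga = np.array([
        (loss(x_val, y_val, alpha + eps * e, theta)
         - loss(x_val, y_val, alpha - eps * e, theta)) / (2 * eps)
        for e in np.eye(len(alpha))
    ])
    alpha = alpha - lr * ga

print(discretize(alpha))
```

The sketch only shows the mechanism: the softmax relaxation makes the discrete operation choice differentiable, the two alternating gradient steps mirror the two levels of the optimization, and the final argmax is the discretization step.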

Administered Sumo robotics and line-racing robot competitions in Syria.

Evaluated the programming and problem-solving skills of college teams in the Sumo challenge.

March - September, 2020

Supervised two teams of freshman students who participated in the open and senior categories of the World Robot Olympiad in Syria.

Development of a smart governance system for autonomous robots in modern cities.

Worked directly on the robots' design and software, and implemented an HTTP-based IoT system for mobile robots.