About Me
I am an independent ML Researcher at ML Collective with a strong focus on Multilingual NLP and ML Efficiency. My current interests revolve around making language technologies accessible to underserved communities, particularly in the Global South, which are often underrepresented and have limited access to resources. This shapes my projects, including the exploration of model compression techniques for LLMs, using quantization, pruning, and knowledge distillation, to ensure that these models are efficient and scalable.
My research also explores the intersection of multilinguality, data scarcity, and cultural inclusivity, with a focus (for now) on African languages and their sign languages. I have contributed to the creation of datasets like MasakhaNEWS, a benchmark for news topic classification across African languages, and investigated the effects of data augmentation techniques for machine translation in low-resource African languages. My current work includes the development of a multimodal, multilingual dataset and model benchmark for African languages, combining vision models with multilingual text decoders, as well as pioneering efforts in real-time sign language translation to bridge the communication gap for the hearing-impaired community in sub-Saharan Africa.
News
- I am currently applying to graduate programs in CS!
- Our preprint, "AfriCaption: Establishing a New Paradigm for Image Captioning in African Languages," is now on arXiv. (August 2025)
- I joined SumUP as an accelerator engineer, working on AI agents and RAG systems. (March 2025)
- At the Deep Learning Indaba, I will be presenting a poster on my work "From Scarcity to Efficiency: Investigating the Effects of Data Augmentation on African Machine Translation". (September 2024)
- This September (2024), I’ll be attending Deep Learning Indaba — Africa’s largest research conference — in Senegal for the third year in a row.
- I started working on a new research project on Image Captioning for African Languages (July 2024)
- Our paper, What Happens When Small Is Made Smaller? Exploring the Impact of Compression on Small Data Pretrained Language Models, was accepted into AfricaNLP, ICLR. (April 2024)
- I graduated with my second bachelor's degree (a CS degree!) in April. This means a lot to me because I had to juggle being an ML engineer, an independent researcher, and a student all at once.
- I am now based in London, England, and open to ML engineering roles and research internships.
- Our paper, MasakhaNEWS: News Topic Classification for African Languages, received the Area Chair Award under the resources and evaluation track at IJCNLP-AACL 2023.
- Our paper, MasakhaNEWS: News Topic Classification for African Languages, was accepted into IJCNLP-AACL. (Nov 2023)
- Won a Poster Award at Deep Learning Indaba 2023 for showcasing my research work, "What Happens When Small Is Made Smaller," at the 2023 edition of the Indaba.
- I am always in need of compute for my research projects, so if you have some to spare, send it my way.
- Our paper, MasakhaNEWS: News Topic Classification for African Languages, won the Best Paper Award at the 2023 ICLR AfricaNLP workshop.
- Our paper, MasakhaNEWS: News Topic Classification for African Languages, was accepted into AfricaNLP, ICLR. (Apr 2023)
- My team was one of the 10 teams that won the Ideathon Award at Deep Learning Indaba 2022. This came with mentorship from top researchers (Isabelle Guyon) and engineers (Megumi Sano) at Google, and compute from the CURE team.
- I was recognized as one of the Top 10 Female Data Scientists in Nigeria by Data Science Nigeria two years in a row (2020 & 2021).
