Strange-Philip/README.md

Philip Abakah

AI Engineer · Optometrist · Builder of Real-World Systems

I didn’t come into this from computer science in the traditional way. I started early — building mobile applications in high school, long before I formally stepped into AI. That foundation in software engineering, especially in mobile development, shaped how I think about building systems: practical, user-focused, and designed to actually be used.

I later studied optometry 👨🏾‍⚕️ and vision science: perception, and how people interact with the world. Somewhere along the way, I started asking a different question:

What if we could build systems that don’t just see… but actually help people navigate reality?

So I kept building, just in a different direction.

My work sits at the intersection of AI, healthcare, and accessibility — not as ideas, but as systems that run in the real world.

One of the most defining projects I’ve built is Safe Step, my final year thesis. It’s an AI-powered navigation system for visually impaired users, built around a real-time computer vision pipeline using YOLOv8. But the model was never the point. The real challenge was translating perception into action, turning noisy visual data into simple, usable guidance. I designed it as a decision-support system, not just a detection model, and evaluated it with human participants. Under simulated blindness, users went from 0% task completion to 100% with the system.
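The core idea in Safe Step, turning noisy detections into one simple instruction, can be sketched in a few lines. This is an illustrative toy, not the actual Safe Step code: the detection format `(label, x_min, x_max)` and the `near_ratio` width heuristic are assumptions standing in for a real YOLOv8 output and depth estimate.

```python
# Minimal sketch (hypothetical, not the Safe Step implementation) of
# mapping detections to guidance. Each detection is (label, x_min, x_max)
# in pixel coordinates, as a YOLOv8-style detector might emit per frame.

def guidance(detections, frame_width, near_ratio=0.4):
    """Turn a frame's detections into one actionable instruction.

    A box wider than `near_ratio` of the frame is treated as "near";
    the instruction steers the user away from that obstacle's side.
    """
    near = [
        (label, x_min, x_max)
        for label, x_min, x_max in detections
        if (x_max - x_min) / frame_width >= near_ratio
    ]
    if not near:
        return "path clear"
    # Decide based on the widest (closest-looking) obstacle.
    label, x_min, x_max = max(near, key=lambda d: d[2] - d[1])
    center = (x_min + x_max) / 2
    if center < frame_width / 2:
        return f"{label} ahead, step right"
    return f"{label} ahead, step left"

print(guidance([("person", 100, 400)], 640))  # person ahead, step right
print(guidance([("pole", 500, 600)], 640))    # path clear (box too small)
```

The point of the sketch is the design stance in the paragraph above: the detector's output is an intermediate signal, and the system's real output is a single decision a user can act on.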

I’ve also built EyeDxAi, an AI-assisted diagnostic tool that combines image understanding with structured reasoning to suggest possible eye conditions and next steps. That project pushed me into thinking about how models can support real decision-making, not just predictions.

A major turning point for me was joining Envision Technologies.

Working on systems used by over 100,000 blind and low-vision users changed how I think about building software. It wasn’t about prototypes anymore. It was about reliability, usability, and designing systems that people depend on.

It was the first time I saw something I worked on exist outside of me — in the hands of real users, in real environments, solving real problems.

That changed everything.

What I Work With

I primarily work with Python, Dart, and JavaScript, building systems that span from machine learning pipelines to fully deployed mobile applications. My development process is centered around frameworks like Flutter for cross-platform mobile systems and TensorFlow and YOLO for real-time computer vision tasks. I rely on tools such as Git for version control, REST APIs for system integration, and Figma to think through user experience and interface design before implementation.

More broadly, my work focuses on computer vision and assistive technology, particularly in building mobile systems that go beyond prediction and function as decision-support tools. I’m especially interested in designing human-centered AI systems — systems that not only interpret data, but translate it into clear, usable actions for people in real-world environments.

Computer Vision · AI/ML · Mobile Systems · Assistive Tech · Human-Centered AI

Direction

I’m interested in systems that span the whole chain:

  • perception → understanding → decision → action

Especially in:

  • assistive navigation
  • real-time vision systems
  • healthcare decision support

Outside of my core work, I enjoy following Formula One 🏎️ and occasionally exploring telemetry data to build small models and better understand performance dynamics. It’s a space where my interest in real-time systems, data, and decision-making naturally extends beyond healthcare into another high-performance domain.

I’m not interested in just building models.
I’m interested in what happens after the model runs.

Pinned

  1. vodafone_redesign (Dart)

  2. simple_floating_bottom_nav_bar (C++): My first Flutter package, a simple floating bottom navigation bar. Use it and let me know what you think 🙂

  3. tiny_notes (Dart): Simple note-taking app by Philip

  4. slightly_notie (Dart): My entry for the Slightly Techie mobile app challenge

  5. iRis (HTML): A simple educational app with note-taking, flash cards, quizzes, and educational resource materials for Ocular Physiology

  6. Swift_Theme_App (Swift): First little project for SwiftUI