I’m just back from the TfL Access All Areas Accessible Transport Exhibition, where I discussed the issue of bus wheelchair space management with some TfL people.
There are two issues to overcome:
- How to detect if the wheelchair space is in use; and
- How to communicate this live information to passengers.
There are two ways to detect the space-occupied status:
- Use the on-bus CCTV cameras to train an AI machine-learning system to recognise when a wheelchair user is on the bus. Once this is working, provide a live feed of the status output (not the CCTV footage, just the status); or
- Use a physical vehicle-detection sensor to detect when the space is in use and send that status to a public server (a sketch of this follows below).
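To make the second option concrete, here is a minimal sketch of what the on-bus side might look like: read an occupancy sensor and push only a boolean status upstream. The endpoint URL, bus identifier and sensor read are all hypothetical placeholders, not anything TfL actually exposes.

```python
import time
import requests  # assumed HTTP client for pushing status updates

# Hypothetical ingestion endpoint; any real TfL API would differ.
STATUS_ENDPOINT = "https://example.org/api/bus/1234/wheelchair-space"


def read_sensor() -> bool:
    """Placeholder for a real occupancy sensor read.

    On a real bus this would query a pressure mat or infrared sensor
    fitted in the wheelchair space; here it is stubbed out.
    """
    return False


while True:
    occupied = read_sensor()
    # Push only the boolean status, never any imagery.
    requests.post(STATUS_ENDPOINT, json={"occupied": occupied}, timeout=5)
    time.sleep(30)  # refresh every 30 seconds
```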
Once the status from every TfL bus is being collected, it should be a simple matter to add it to the data sent to the Countdown screens at bus stops, perhaps displayed as the traditional wheelchair symbol?
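As a rough illustration of what that addition could look like, here is a sketch of a Countdown arrival record with one extra field. The field names are illustrative only, not TfL’s real schema.

```python
# A Countdown arrival prediction record as it might look with one extra
# field added; all field names here are illustrative, not TfL's real schema.
prediction = {
    "lineName": "73",
    "destinationName": "Victoria",
    "timeToStation": 180,              # seconds until arrival
    "wheelchairSpaceOccupied": False,  # new field from the on-bus status feed
}

# The stop display could then append the wheelchair symbol when the space is free.
symbol = "" if prediction["wheelchairSpaceOccupied"] else " ♿"
print(f"{prediction['lineName']} {prediction['destinationName']} "
      f"{prediction['timeToStation'] // 60} min{symbol}")
```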
I learnt that TfL keep their bus CCTV recordings for 30 days, so there is a month’s worth of footage that could be labelled by a human and then fed to an off-the-shelf machine-learning system. After that, there is an ongoing stream of data with which to validate the trained system.
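For the “off-the-shelf” part, something as simple as fine-tuning a pretrained image classifier on the labelled frames would be a reasonable starting point. This is only a sketch: the folder layout, class names and training settings are assumptions, not a tested pipeline.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Frames exported from the CCTV archive, sorted by a human labeller into
# two folders: "wheelchair_present" and "wheelchair_absent" (hypothetical layout).
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_data = datasets.ImageFolder("labelled_frames/train", transform=transform)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Off-the-shelf backbone: a pretrained ResNet-18 with its final layer
# replaced by a two-class head (space occupied / space free).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimiser = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):
    for images, labels in loader:
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()

torch.save(model.state_dict(), "wheelchair_space_classifier.pt")
```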
Perhaps Google Cloud Platform could offer some resources?
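If GCP were used, a plausible first step would be getting the labelled frames into Cloud Storage so that its managed ML tooling can reach them. The bucket name and file paths below are made up for illustration.

```python
from google.cloud import storage  # assumes the google-cloud-storage client library

# Hypothetical bucket; uploading human-labelled frames to Cloud Storage is a
# typical first step before training with GCP's managed ML services.
client = storage.Client()
bucket = client.bucket("tfl-wheelchair-space-frames")  # illustrative name


def upload_frame(local_path: str, label: str) -> None:
    """Upload one labelled CCTV frame under a folder named after its label."""
    blob = bucket.blob(f"{label}/{local_path.rsplit('/', 1)[-1]}")
    blob.upload_from_filename(local_path)


upload_frame("frames/bus1234_093012.jpg", "wheelchair_present")
```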