The AI service feature has gained new changes that allow closer integration between the selected service and the game being played, letting the service read and press gamepad buttons and read the current screen image. The example video above shows a custom service (still in development) designed to make Final Fantasy 1 accessible and playable by blind users.

When started, the AI service will continually parse the screen and describe what is being shown. When in a town or overworld view, it will describe what is around the player to the west, north, east, and south, as well as any new things of interest that have appeared on screen (e.g., a townsperson, a weapon shop, or a treasure chest).

When the emulator is paused, it will give a more detailed description of what is on the screen, including how far the player can walk in each direction and all things of interest along with their coordinates relative to the player. If the player holds the select button while paused, the AI service will read out the list of things of interest on the screen and allow the player to scroll through them and select one. Once a selection is made, the AI service will unpause the game, move the player to that thing, and interact with it.

When on a menu or battle screen, the service will read out the text on the screen and the currently selected menu option.

We will have more information on this for you soon, after the initial testing and feedback period is over.
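To give a sense of how the overworld descriptions might work, here is a minimal sketch of turning screen coordinates into spoken directions relative to the player. All names, the tile-based coordinate layout, and the data format are illustrative assumptions, not the service's actual API:

```python
# Hypothetical sketch: convert a point of interest's position into a
# description relative to the player. Assumes tile coordinates with
# x increasing to the east and y increasing to the south.

def describe_relative(player, thing):
    """Return a short spoken-style description of where a thing of
    interest sits relative to the player, e.g. "weapon shop: 3 west"."""
    dx = thing["x"] - player["x"]
    dy = thing["y"] - player["y"]
    parts = []
    if dy < 0:
        parts.append(f"{-dy} north")
    elif dy > 0:
        parts.append(f"{dy} south")
    if dx > 0:
        parts.append(f"{dx} east")
    elif dx < 0:
        parts.append(f"{-dx} west")
    if not parts:
        return f"{thing['name']}: here"
    return f"{thing['name']}: " + ", ".join(parts)


def describe_screen(player, things):
    # One line per thing of interest, ready to hand to a TTS engine.
    return [describe_relative(player, t) for t in things]


player = {"x": 8, "y": 7}
things = [
    {"name": "weapon shop", "x": 5, "y": 7},
    {"name": "treasure chest", "x": 10, "y": 3},
]
print(describe_screen(player, things))
# → ['weapon shop: 3 west', 'treasure chest: 4 north, 2 east']
```

A real service would pull these positions from the emulator's memory or from image recognition on the screen capture; the point here is only how relative coordinates can become directional speech.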