[LRA-108] Research Speech API integration Created: 21/Mar/14  Updated: 13/May/14  Resolved: 13/May/14

Status: Resolved
Project: Liferay Apps
Component/s: None
Affects Version/s: None
Fix Version/s: None

Type: Task Priority: Major
Reporter: mcalvo Assignee: mcalvo
Resolution: Fixed Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Resolution Date: 13/May/14

Description
It would make a good app if we could provide an accessibility feature in Liferay. DJ was looking at the Java speech API; if we could make Liferay understand spoken human commands, it would benefit a certain demographic:
Resource: http://cmusphinx.sourceforge.net/wiki/tutorial

Your task is to read about this API and write a summary covering its capabilities, a high-level description of how it is programmed, and your opinion of how difficult it is to use.

I think we can try using it in Liferay to understand voice commands like the following (see the sketch after this list):
- Go to home page or site X
- Tell me my upcoming meetings (from the Calendar portlet)
- Things like those
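As a starting point for the research, here is a minimal sketch of what a recognizer loop might look like with Sphinx-4 (the Java library behind the CMU Sphinx tutorial linked above). The model paths assume the stock sphinx4-data US-English models are on the classpath, and the command dispatch is a hypothetical stub, not an actual Liferay integration:

{code:java}
import edu.cmu.sphinx.api.Configuration;
import edu.cmu.sphinx.api.LiveSpeechRecognizer;
import edu.cmu.sphinx.api.SpeechResult;

public class VoiceCommandDemo {

    public static void main(String[] args) throws Exception {
        // Standard Sphinx-4 setup: point the recognizer at the bundled
        // US-English acoustic model, dictionary, and language model.
        Configuration config = new Configuration();
        config.setAcousticModelPath("resource:/edu/cmu/sphinx/models/en-us/en-us");
        config.setDictionaryPath("resource:/edu/cmu/sphinx/models/en-us/cmudict-en-us.dict");
        config.setLanguageModelPath("resource:/edu/cmu/sphinx/models/en-us/en-us.lm.bin");

        LiveSpeechRecognizer recognizer = new LiveSpeechRecognizer(config);
        recognizer.startRecognition(true); // true = clear previously cached audio

        SpeechResult result;
        while ((result = recognizer.getResult()) != null) {
            String command = result.getHypothesis();
            System.out.println("Heard: " + command);

            // Hypothetical dispatch: map recognized phrases to Liferay actions.
            if (command.startsWith("go to")) {
                // e.g. navigate to a page or site
            } else if (command.contains("upcoming meetings")) {
                // e.g. query the Calendar portlet
            }
        }

        recognizer.stopRecognition();
    }
}
{code}

Note: for a small, fixed command set like the one above, constraining the recognizer with a JSGF grammar (Configuration.setGrammarPath / setGrammarName / setUseGrammar) would likely give better accuracy than the general-purpose language model; worth evaluating both as part of this research.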