Type: Task
Status: Resolved
Priority: Major
Resolution: Fixed
Component/s: None
Labels: None
It would be good if we could provide an accessibility feature in Liferay. DJ was looking at the Java speech API; if we could make Liferay understand spoken commands, it would benefit a certain demographic.
Resource: http://cmusphinx.sourceforge.net/wiki/tutorial
Your task is to read about this API and write a summary of its capabilities and, at a high level, how it is programmed, along with your opinion of how difficult it is to use.
I think we can try to use it in Liferay to understand voice commands like:
- Go to home page or site X
- Tell me my upcoming meetings (from the Calendar portlet)
- Things like those
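For a command set this small, CMU Sphinx supports grammar-based recognition, which should be more reliable than free-form dictation. What follows is a minimal sketch, not a definitive implementation: it assumes the Sphinx-4 Java API covered in the tutorial (the edu.cmu.sphinx.api package and the model paths from the sphinx4 5prealpha distribution). The grammar file name (commands.gram), its resource location, and the site names are hypothetical placeholders.

A JSGF grammar file, commands.gram, restricting what the recognizer will accept:

#JSGF V1.0;
grammar commands;
// Hypothetical command set; the site names are placeholders.
public <command> = go to home page
                 | go to site ( marketing | intranet )
                 | tell me my upcoming meetings;

And a small Java listener loop that prints each recognized command; the println is where a real integration would dispatch into Liferay (site navigation, Calendar portlet query, etc.):

import edu.cmu.sphinx.api.Configuration;
import edu.cmu.sphinx.api.LiveSpeechRecognizer;
import edu.cmu.sphinx.api.SpeechResult;

public class VoiceCommandListener {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        // Bundled US English acoustic model and dictionary from the
        // sphinx4-data artifact; adjust the paths if another model is used.
        config.setAcousticModelPath("resource:/edu/cmu/sphinx/models/en-us/en-us");
        config.setDictionaryPath("resource:/edu/cmu/sphinx/models/en-us/cmudict-en-us.dict");
        // Use the JSGF grammar above instead of a full language model.
        // "resource:/grammars" and "commands" are assumed locations.
        config.setGrammarPath("resource:/grammars");
        config.setGrammarName("commands");
        config.setUseGrammar(true);

        LiveSpeechRecognizer recognizer = new LiveSpeechRecognizer(config);
        recognizer.startRecognition(true); // true = discard previously cached audio

        while (true) {
            // Blocks until the recognizer has a complete utterance.
            SpeechResult result = recognizer.getResult();
            System.out.println("Heard: " + result.getHypothesis());
        }
    }
}

Getting this far is mostly configuration. The harder parts for Liferay would be audio capture (LiveSpeechRecognizer reads from the local microphone, so a portal deployment would need a separate path to get browser audio to the server) and mapping the recognized hypotheses to portlet actions.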