Liferay Apps / LRA-108

Research Speech API integration

    Details

    • Type: Task
    • Status: Resolved
    • Priority: Major
    • Resolution: Fixed
    • Component/s: None
    • Labels:
      None

      Description

      It would be a good app if we could provide an accessibility feature in Liferay. DJ was looking at the Java Speech API, and if we could make Liferay understand commands in human speech, it would be beneficial for a certain demographic.
      Resource: http://cmusphinx.sourceforge.net/wiki/tutorial

      Your task is to read about this API and create a summary of its capabilities, how it is programmed at a high level, and your opinion of how difficult it is to use.

      I think we can try to use it in Liferay to understand voice commands like:
      - Go to home page or site X
      - Tell me my upcoming meetings (from the Calendar portlet)
      - Things like those
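
      Once a recognizer (for example CMU Sphinx's sphinx4, from the tutorial linked above) produces a transcript, the portal side reduces to mapping that text to an action. A minimal sketch of that dispatch step, assuming hypothetical action strings and class names (the transcript would come from the recognizer; no Liferay or Sphinx API is used here):

      ```java
      import java.util.Locale;

      // Hypothetical sketch: map a recognized transcript to a portal action.
      // Command phrases and action strings are illustrative assumptions.
      public class VoiceCommandDispatcher {

          public static String dispatch(String transcript) {
              String text = transcript.toLowerCase(Locale.ROOT).trim();
              if (text.startsWith("go to ")) {
                  // e.g. "go to home page" or "go to site X"
                  return "NAVIGATE:" + text.substring("go to ".length());
              }
              if (text.contains("upcoming meetings")) {
                  // Would query the Calendar portlet in a real integration
                  return "CALENDAR:LIST_UPCOMING";
              }
              return "UNRECOGNIZED";
          }

          public static void main(String[] args) {
              System.out.println(dispatch("Go to home page"));
              System.out.println(dispatch("Tell me my upcoming meetings"));
          }
      }
      ```

      The hard part the research should evaluate is recognition accuracy and grammar setup in Sphinx, not this dispatch logic, which stays trivial once the transcript is reliable.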

        Attachments

          Activity

            People

            • Assignee:
              mcalvo
              Reporter:
              mcalvo
            • Votes:
              0
              Watchers:
              1

              Dates

              • Created:
                Updated:
                Resolved:
                Resolution Date: