Lisp HUG Maillist Archive

Cocoa speech recognition

I'm trying to get Speech Recognition to work using Cocoa (on Mac, of course) and have put in a couple of hours going through Apple's documentation on speech recognition, Hillegass's Cocoa Programming book, and the LispWorks manuals on the foreign language interface and Objective-C/Cocoa (some of which, of course, presuppose that you know the others, as well as C programming).

I'm able to create a recognizer object and have it start and stop listening, but I have not figured out how to set the commands that can be recognized. Apparently this requires creating an NSArray containing the commands as strings. Here's how it looks in Objective-C (Apple's example):

NSArray *cmds = [NSArray arrayWithObjects:@"Sing", @"Jump", @"Roll over", nil];
        recog = [[NSSpeechRecognizer alloc] init]; // recog is an ivar
        [recog setCommands:cmds];

Here's what I've got in Lisp so far:

(defpackage "SR-PACKAGE"
  (:add-use-defaults t)
  (:use "CAPI" "OBJC" "FLI"))

(in-package "SR-PACKAGE")

(objc:ensure-objc-initialized
  :modules
  '("/System/Library/Frameworks/Foundation.framework/Versions/C/Foundation"
    "/System/Library/Frameworks/Cocoa.framework/Versions/A/Cocoa"))


(defvar *recog* nil)
(defvar *commands* nil)

(defun make-recognizer ()
  (setf *recog* (invoke "NSSpeechRecognizer" "new")))

(defun set-commands ()
  (when *recog* 
    (setf *commands* (invoke "NSArray" "arrayWithObjects:" "one,two,nil")) ; TROUBLE HERE
    (invoke *recog* "setCommands:" *commands*)))   

(defun start-listening ()
  (when *recog* (invoke *recog* "startListening")))

(defun stop-listening ()
  (when *recog* 
    (invoke *recog* "stopListening")
    (release *recog*)
    (setf *recog* nil)))

I haven't shown the various ways I've tried to create the NSArray holding the command strings, including using fli:convert-to-foreign-string. Where I have "one,two,nil" there should apparently be a single object of type objc-object-pointer, if I remember correctly.

Has anyone succeeded with this, or can anyone tell me how to do what's in the first line of Apple's code sample?

I'll also comment that, once this works (assuming!), it seems pretty simple and sweet to harness such a complex technology!

Thanks for your help.

Laughing Water

Re: Cocoa speech recognition


On Feb 8, 2008, at 1:06 PM, Laughing Water wrote:

> Has anyone succeeded with this, or can anyone tell me how to do  
> what's in the first line of Apple's code sample?

(defmacro @ (&body body) `(objc:invoke ,@body))
(defmacro @into (&body body) `(objc:invoke-into ,@body))
(defmacro @pool (&body body) `(objc:with-autorelease-pool () ,@body))
(defmacro @bool (&body body) `(objc:invoke-bool ,@body)) ;; not used here but useful

(defparameter commands (vector "Sing" "Jump" "Roll over"))

(defparameter recognizer (@ (@ "NSSpeechRecognizer" "alloc") "init"))

(@ recognizer "setCommands:" commands)

In other words, you get the NSArray creation for free by passing setCommands: a Lisp vector of strings.
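
Applied to the original post's definitions, the fix might look like this (a sketch, assuming the same *recog* and *commands* special variables from the earlier code):

(defun set-commands ()
  ;; Pass a Lisp vector of strings; the LispWorks Objective-C bridge
  ;; converts it to an NSArray of NSStrings for the NSArray-typed
  ;; setCommands: parameter -- no arrayWithObjects: call needed.
  (when *recog*
    (setf *commands* (vector "Sing" "Jump" "Roll over"))
    (objc:invoke *recog* "setCommands:" *commands*)))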

See the docs for invoke (pp. 34 and ff.) in the "Objective-C and Cocoa User Guide and Reference Manual".

regards,

Ralph



Raffael Cavallaro, Ph.D.
raffaelcavallaro@mac.com


Updated at: 2020-12-10 08:44 UTC