Pet Companion tutorial

This tutorial will describe how to create your own Vizy application, accessible from a web browser either locally or over the Internet. After following this tutorial, you will be able to:

  1. Create a custom Vizy application view in Python that you can access from any web browser
  2. Add live video of what Vizy sees (i.e., your pet)
  3. Create GUI elements such as buttons to accomplish special things
  4. Use a TensorFlow neural network to detect your pet
  5. Post images and videos of your pet to the cloud using Google Photos
  6. Send text notifications using Google Hangouts
  7. Make a treat dispenser for your pet
  8. Hook your treat dispenser up to Vizy and control it from a web browser, via text notification, or have Vizy control it autonomously
  9. Share on the Web so you and your pet's other friends can give them yummy treats from anywhere

You will need the following:

- Vizy with a network connection (WiFi or Ethernet)
- Computer (Windows, macOS, Chrome OS, or iOS tablet)
- Google account
- Powered speakers (optional)
- Materials for making treat dispenser FIXME: link (optional)
- Familiarity with Python

You will not need:

- Knowledge of JavaScript, HTML, or CSS (yay!)

Creating a new application view

One of the simplest ways to add new functionality to Vizy is to add a new view. A view is typically just a file of Python code that does the new stuff you want. All you need to do is add your code to the views directory on Vizy, and Vizy's software will automatically execute it when you click reload in your browser. Super simple!

Accessing Vizy's view directory

Vizy runs a full Linux distribution (Raspberry Pi OS), including a file system (files, directories, and such), and we need to access that file system to add and edit some files. You can do this by accessing a “network share volume” on Vizy, or by using a browser and Vizy's built-in text editor.

Windows

Accessing Vizy's view directory from a Windows computer is simple – just bring up a File Explorer window and type in “\\vizy.local”. Then double-click on the “views” directory to access its contents.

MacOS

Access Vizy's view directory from the Finder. Click on Go in the menu bar at the top of the screen, then select Connect to Server. Enter smb://vizy.local and connect as Guest. Then select the views directory to access its contents. If you have trouble connecting, this page describes some configuration steps, such as enabling file sharing on your Mac.

Chrome OS

Most Chromebooks support network file sharing through the Files app. Others require the Network Files Share Application. You want to add an SMB file share with hostname vizy.local and share name views.

Strictly through browser

If you don't want to bother with opening a network share volume on Vizy, you can use Vizy's built-in text editor FIXME!. Vizy's editor is reasonably sophisticated and allows you to create as well as edit files on Vizy.

Creating the new directory and new source file

Once you have access to the views directory, create a new directory called pet_companion inside the views directory. This is where all of our application's code and resources will live. From within the pet_companion directory, create a new file called pet_companion.py using your favorite text editor (or Vizy's built-in text editor). Copy the following code into the file:

from vizy import VizyThreadedView
from kritter import Kvideo

class PetCompanion(VizyThreadedView):
 
    def __init__(self):
        super().__init__("Pet Companion") 
        
        # Grab camera module           
        self.camera = self.kapp.modules["Camera"].camera

        # Layout (just a video window for now)
        self.video = Kvideo()
        self._layout = [self.video]

    # This gets called when we enter our Pet Companion view.
    def view_enter(self):
        # Start camera stream.
        self.stream = self.camera.stream()
 
    # This gets called over and over as long as our Pet Companion view is active.
    def view_loop(self):
        # Grab frame from camera. 
        frame = self.stream.frame()[0]
        # Push frame to the video window in browser.
        self.video.push_frame(frame)

    # This gets called when we switch to a different view.
    def view_exit(self):
        # Stop camera stream.
        self.stream.stop()

    # This returns the layout contents for display in browser. 
    def layout(self):
        return self._layout     

This code is mostly boilerplate, but it still does something useful – it streams video to your browser using decent streaming technology and compression (not the mediocre stuff!).

The code creates a video window (self.video) in your browser by putting it in the layout (line 14). It grabs frames from the camera (line 24) and pushes them to the video window (line 26). It uses VizyThreadedView (line 4) to give us our own thread to run our application code by calling view_loop() (line 22) as long as our view is active.

Running the view

Save the file and bring up a browser tab and point it to http://vizy.local.

You should see the Pet Companion view selector button on the left side of the browser window that you can click on. When you click on it, you'll be taken to the Pet Companion view, where you'll see a live view of Vizy's camera.

Adding a "Call pet" button

Let's add a button to our view to call our pet over.

from vizy import VizyThreadedView
from kritter import Kvideo, Kbutton

class PetCompanion(VizyThreadedView):
  
    def __init__(self):
        super().__init__("Pet Companion") 
        
        # Callback for the "Call pet" button.
        def call_pet_callback():
            print("Calling pet...")

        # Grab camera module           
        self.camera = self.kapp.modules["Camera"].camera

        # Layout (video window plus a button)
        self.video = Kvideo()
        call_pet_button = Kbutton(name="Call pet", callback=call_pet_callback)
        self._layout = [self.video, call_pet_button]

We will add the code that does the actual pet calling later.

Using the App Console

You can bring up the App Console by pointing your browser to http://vizy.local/app_console. From this page you can see print messages and/or any errors that your code encounters.

First, be sure to click on your browser's reload button so Vizy reloads the new code. Now click on the Call pet button. You should see the Calling pet… message on the console each time you click the button.

Hooking the button up to something

Of course, we want the button to do something useful, so let's have Vizy play an audio clip each time you press the button. There are lots of ways to do this. Below is code that does that by launching omxplayer (pre-installed) and playing an audio clip. It will require amplified speakers to be plugged into Vizy's (Raspberry Pi's) audio output port.

You can supply your own audio clip or you can use someone else's soothing voice to entice your pet to come over to your Vizy camera. Almost all formats are supported, including .wav and .mp3. You can copy the audio clip into the pet_companion directory (by using either File Explorer or the Finder) so your code can locate it.

...
from subprocess import run

class PetCompanion(VizyThreadedView):

    def __init__(self):
        super().__init__("Pet Companion") 
...         
        def call_pet_callback():
            print("Calling pet...")
            run(["omxplayer", "hey.wav"]) 

After copying the audio file, and clicking on reload/refresh on your browser, you should be able to click on the Call pet button and hear the audio clip played. Nice work!
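If you hear nothing when you click the button, one common culprit is the working directory: the relative path "hey.wav" only finds the clip if Vizy launches your code from the pet_companion directory. Below is a minimal sketch (our own addition, not part of Vizy's API – the MEDIA_DIR name is ours) that builds an absolute path to the clip from the location of the source file, so playback works regardless of the working directory.

import os
from subprocess import run

# Directory containing this source file (and the copied audio clip).
MEDIA_DIR = os.path.dirname(os.path.abspath(__file__))

def call_pet_callback():
    print("Calling pet...")
    # Use an absolute path so omxplayer finds the clip no matter
    # which directory the application was launched from.
    run(["omxplayer", os.path.join(MEDIA_DIR, "hey.wav")])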

Adding pet detection

With the ability to call your pet and stream live video, the Pet Companion application might actually be useful, but another nice feature would be for Vizy to take pictures of your pet and text them to you throughout the day.

In order to do this, Vizy needs to be able to detect your pet, and an effective way to detect your pet is with a deep learning neural network.

The code below adds the neural network detector for all 90 objects in the COCO dataset. When an object is detected, a box is drawn around the object and the box is labeled.

    def __init__(self):
        super().__init__("Pet Companion") 

        # Callbacks
        def call_pet_callback():
            print("Calling pet...")
            run(["omxplayer", "hey.wav"]) 

        # Grab camera module           
        self.camera = self.kapp.modules["Camera"].camera

        # Layout (just a video window for now)
        self.video = Kvideo()
        call_pet_button = Kbutton(name="Call pet", callback=call_pet_callback)
        self._layout = [self.video, call_pet_button]

        # Initialize TensorFlow neural network for detecting common objects (cocodataset.org).
        self.detector = TensorFlowDetector("coco")

    # This gets called when we enter our Pet Companion view.
    def view_enter(self):
        # Start camera stream.
        self.stream = self.camera.stream()
        # Load neural network (takes a few seconds.)
        self.detector.load()
        # Reset detected.        
        self.detected = []
 
    # This gets called over and over as long as our Pet Companion view is active.
    def view_loop(self):
        # Grab frame from camera. 
        frame = self.stream.frame()[0]
        # Detect things (pets) using neural network.
        detected = self.detector.detect(frame, block=False)
        # If we detect something...
        if detected is not None:
            # ...save for render_detected() overlay. 
            self.detected = detected
        # Overlay detection boxes and labels on top of frame.
        render_detected(frame, self.detected)
        # Push frame to the video window in browser.
        self.video.push_frame(frame)

    # This gets called when we switch to a different view.
    def view_exit(self):
        # Unload neural network (free up memory)
        self.detector.unload()
        # Stop camera stream.
        self.stream.stop()

The real work happens on line 34, where we send a frame to the neural network and get back an array of detected objects and their locations in the image. The call to render_detected() (line 40) overlays the detection boxes and labels on top of the image to be displayed in the browser. Go ahead and give it a try – point Vizy at your pet, or hold a picture of your pet in front of Vizy. You'll notice that Vizy is detecting more than just pets – people, cell phones, books, chairs, etc. are also detected and labeled.

The COCO dataset includes dogs, cats, birds, horses, sheep, cows, bears, zebras, and giraffes, but we'll assume that you're only interested in detecting dogs and cats. To restrict things to dogs and cats, we need to filter out the things we're not interested in. This only requires a few lines of code (below).

    def view_loop(self):
        # Grab frame from camera. 
        frame = self.stream.frame()[0]
        # Detect things (pets) using neural network.
        detected = self.detector.detect(frame, block=False)
        # If we detect something...
        if detected is not None:
            # ...save for render_detected() overlay. 
            self.detected = self.filter_detected(detected)
        # Overlay detection boxes and labels on top of frame.
        render_detected(frame, self.detected)
        # Push frame to the video window in browser.
        self.video.push_frame(frame)

    def filter_detected(self, detected):
        # Discard detections other than dogs and cats. 
        detected = [d for d in detected if d.label=="dog" or d.label=="cat"]
        return detected 

Texting pictures of your pet

The code below will upload pictures of your pet to Google Photos and then text you a link to the photo through Google Hangouts.

    def __init__(self):
...
        # Initialize TensorFlow neural network for detecting common objects (cocodataset.org).
        self.detector = TensorFlowDetector("coco")

        # Cloud stuff
        self.text_media = self.kapp.modules["Google cloud"].hangouts_text
        self.image_media = self.kapp.modules["Google cloud"].photos
        self.sending = False # flag for throttling

        # Register callback for when image is finished uploading.
        @self.image_media.callback_sent
        def sent(index, url, type):
            self.sending = False
            if url is not None:
                # Send text notification with picture URL via Google Hangouts.
                self.text_media.send("New pet " + type + "! " + url)

...

    def view_loop(self):
        # Grab frame from camera. 
        frame = self.stream.frame()[0]
        # Detect things (pets) using neural network.
        detected = self.detector.detect(frame, block=False)
        # If we detect something...
        if detected is not None:
            # ...save for render_detected() overlay. 
            self.detected = self.filter_detected(detected)
        # Overlay detection boxes and labels on top of frame.
        render_detected(frame, self.detected)
        # Push frame to the video window in browser.
        self.video.push_frame(frame)

        # Upload pet picture to cloud.
        if self.detected and not self.sending:
            self.sending = True 
            self.image_media.send_image(frame)

This code is fairly simple. I mean, we're uploading pictures of your pet to the cloud (after using a neural network to detect said pet) and then sending text notifications, likely to your phone. Let's pause for a moment to appreciate the wonders that technology has brought us… OK, we do the actual uploading of the photo on line 38. And there is a simple mechanism to “throttle” the photo uploading with the self.sending variable (lines 9, 36, and 37) so that you don't get 10 text messages in a second. And every time a photo finishes being uploaded, the sent() function (line 13) gets called, which will send a text message with the photo's URL. I should mention that in addition to getting text notifications about new photos, all photos go in the main photo album for later perusal by you and others.

Texting videos of your pet

Photos are great, but videos are better. Vizy can determine when it's captured a video clip of your pet and then text you a link to the video clip.

    def __init__(self):
        super().__init__("Pet Companion") 

        # Parameters
        self.atten = 0.4 # Decrease to reduce false positives for video trigger.      
        self.threshold = 0.8 # Decrease to make video trigger more sensitive.
        self.start_shift = -1.0 # Seconds of video to pre-record
        self.duration = 5.0  # Seconds of video to record
        self.trigger = 0 # Video trigger state

...

    def view_enter(self):
...
        # Start video, timeshifted 1 second in the past (start_shift=-1).  
        self.video_clip = self.camera.record(start_shift=self.start_shift, duration=self.duration)
 
    # This gets called over and over as long as our Pet Companion view is active.
    def view_loop(self):
...
        # Upload pet picture to cloud.
        if self.detected and not self.sending:
            self.sending = True 
            self.image_media.send_image(frame)
            
        # Upload pet video to cloud.
        if not self.video_clip.recording() and not self.sending:
            self.sending = True 
            self.image_media.send_video(self.video_clip)
            # Restart video clip recording.
            self.video_clip = self.camera.record(start_shift=self.start_shift, duration=self.duration)


    def filter_detected(self, detected):
        # Discard detections other than dogs and cats. 
        detected = [d for d in detected if d.label=="dog" or d.label=="cat"]
        self.trigger = len(detected)*self.atten + self.trigger*(1-self.atten)
        if self.video_clip.recording():
            # If trigger>threshold, we have an "interesting" video for cloud. 
            if self.trigger>self.threshold:
                self.video_clip.start()
        return detected

    # This gets called when we switch to a different view.
    def view_exit(self):
...
        # Stop recording video clip
        if self.video_clip.recording():
            self.video_clip.stop()

This code is a little more complicated. We start recording on line 16 with a negative start_shift parameter (line 7). This will keep 1 second of video pre-buffered in memory so that when we decide to start recording (on line 41), we already have that 1 second recorded. This way we don't miss any footage of your pet, because our detector has about 1 second of latency. Lastly, when the video is done, after 5 seconds of video (self.duration, line 8) have been recorded, Vizy will encode and upload the video clip (line 29).

There is also some logic to determine when your pet is in the video. This happens on lines 37-41. self.trigger is a slow-to-ramp-up value to which we apply a threshold (self.threshold) to estimate when there is an interesting video clip of your pet to upload. The rate of ramp-up is determined by self.atten (line 5). It's assumed that when self.trigger reaches a certain value (line 39), your pet has been in the video, and it's not just a false-positive detection. Right after we initiate the video upload (line 29), we start a new video (line 31), so we're ready for the next video.

The parameters (self.atten, self.threshold, self.start_shift, and self.duration, lines 5-8) can be adjusted/tuned to maximize the likelihood that a good video clip of your pet is captured.
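If you'd like a feel for how self.atten and self.threshold interact before pointing Vizy at your pet, here's a small standalone simulation (not part of the application) that assumes one pet detection on every frame and prints how the trigger ramps up with the default parameters:

# Standalone simulation of the smoothing done in filter_detected(),
# assuming len(detected) == 1 on every frame.
atten = 0.4
threshold = 0.8
trigger = 0

for frame in range(1, 11):
    trigger = 1*atten + trigger*(1 - atten)  # same update as filter_detected()
    print(f"frame {frame}: trigger = {trigger:.2f}")
    if trigger > threshold:
        print("video clip would be triggered")
        break

With the default values the trigger crosses the threshold after about four consecutive detections; lowering self.atten makes it take longer (fewer false positives), and lowering self.threshold makes it trigger sooner.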

Adding a treat dispenser

Being able to access Vizy remotely and give your pet a treat would be nice, right? Such a feature requires some kind of treat dispenser. Fortunately, making a treat dispenser for your pet isn't too difficult FIXME: link. (And isn't your pet worth it?)

After following the assembly and connection instructions FIXME: link, you're ready to add a “Dispense Treat” button to the code.

    def __init__(self):
...
        self.dispenser_io_bit = 0 # I/O bit that the solenoid is hooked to
...
        def dispense_treat_callback():
            print("Dispensing treat...")
            self.dispense_treat()

        # Grab modules           
        self.camera = self.kapp.modules["Camera"].camera
        self.power_board = self.kapp.modules["Power/IO"].controller  

        # Set bit 0 of IO for high current output to control treat dispenser.
        self.power_board.io_set_mode(IO_MODE_HIGH_CURRENT, self.dispenser_io_bit)
        self.power_board.io_write(self.dispenser_io_bit, 1) # Turn off solenoid.

        # View layout
        self.video = Kvideo()
        call_pet_button = Kbutton(name="Call pet", callback=call_pet_callback)
        dispense_treat_button = Kbutton(name="Dispense treat", callback=dispense_treat_callback)
        self._layout = [self.video, call_pet_button, dispense_treat_button]

...

    def dispense_treat(self):
        self.power_board.io_write(self.dispenser_io_bit, 0) # Turn on solenoid
        sleep(0.5) # Wait a bit to give solenoid time to dispense treats 
        self.power_board.io_write(self.dispenser_io_bit, 1) # Turn off solenoid

Here, we add another button to the view. The callback function dispense_treat_callback() (line 5) calls dispense_treat() (line 25), which just applies power to a solenoid that controls the flow of treats.

Adding a "text treat" option

Being able to press a button on a web page to dispense a treat is useful, and being able to text the word “treat” via Google Hangouts to do the same is handy as well. The code below implements a text handler that does exactly that:

    def __init__(self): 
...
        # Register callback for when we receive a text message.
        @self.text_media.callback_receive
        def receive(message):
            if message=="treat":
                self.dispense_treat()
                self.text_media.send("ok")

This code registers a callback with the text media object, so that when you text the word “treat” to your Vizy camera, treats are dispensed, pets are made happy, and you get an “ok” confirmation.
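If you'd like the handler to understand more than one command, it's straightforward to extend. The sketch below is our own variation (not from the original tutorial): it reuses the audio-clip code from the "Call pet" button, accepts the word “call” as well, and tolerates extra whitespace and capitalization in the message.

        # Register callback for when we receive a text message.
        @self.text_media.callback_receive
        def receive(message):
            command = message.strip().lower()
            if command == "treat":
                self.dispense_treat()
                self.text_media.send("ok")
            elif command == "call":
                # Same audio clip used by the "Call pet" button.
                run(["omxplayer", "hey.wav"])
                self.text_media.send("calling...")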

Adding "auto treat" option

Maybe you want your Pet Companion to give your pet little rewards throughout the day. One simple way to accomplish this is to dispense a treat every time self.trigger exceeds a certain value. Simple, yes? We'll leave this as an exercise for you. There are more sophisticated ways of doing this also… These kinds of problems can be fun to solve! (Note, your pet will likely forgive you if your code crashes.)
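If you'd like a nudge, here's one possible sketch (not the only way to do it, and the auto_treat_threshold, auto_treat_cooldown, and last_auto_treat names are ours, not part of Vizy's API): dispense a treat when self.trigger is high, with a cooldown so your pet isn't buried in treats. It assumes from time import time has been added to the imports.

    def __init__(self):
...
        self.auto_treat_threshold = 0.9  # trigger level that earns a treat
        self.auto_treat_cooldown = 30*60 # seconds between automatic treats
        self.last_auto_treat = 0         # time of the last automatic treat
...
    def view_loop(self):
...
        # Automatically dispense a treat when the trigger is high and
        # the cooldown has elapsed.
        if (self.trigger > self.auto_treat_threshold and
                time() - self.last_auto_treat > self.auto_treat_cooldown):
            self.last_auto_treat = time()
            self.dispense_treat()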

Share your application on the Web

The Pet Companion application sorta wants to be accessible via the Web, and not just via your local network, so that when you're out and about you can see video of your little guy/gal and give him/her little rewards whenever you like.

Fortunately, Vizy makes it easy to share your applications seamlessly on the Web – no need to log into your router and “punch holes”. Click on the Web sharing button on the left side of the browser tab.

If Web sharing isn't enabled, click on “Enable Web sharing” to enable it. The public URL should appear in the “Public URL” field shortly. You can use this URL to access your Pet Companion from anywhere on the Web, even if your Pet Companion is behind a firewall or router. This works by connecting Vizy to a public relay server which forwards requests from clients (Web browsers) on the Internet. OK, what's next?

Ideas and improvements

Being able to give your pet treats remotely is pretty great… being able to play fetch with your pet remotely is even better. You can make a robotic fetch machine/ball launcher (although we haven't found a good guide for this), or you can buy one. (Are you surprised at the number of choices out there?) Making Vizy control these fetch machines is part of the fun (we think). And it isn't difficult – hopefully this guide has helped with the how, if not the why.
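For example, if your ball launcher can be triggered with a simple high-current output (like the treat dispenser's solenoid), controlling it could look a lot like dispense_treat(). The sketch below is only an illustration – the I/O bit, the timing, and whether your particular launcher can be driven this way at all are assumptions you'd need to verify.

    def __init__(self):
...
        self.launcher_io_bit = 1 # I/O bit the ball launcher is hooked to (assumed)
        self.power_board.io_set_mode(IO_MODE_HIGH_CURRENT, self.launcher_io_bit)
        self.power_board.io_write(self.launcher_io_bit, 1) # Launcher off
...
    def launch_ball(self):
        self.power_board.io_write(self.launcher_io_bit, 0) # Trigger launcher
        sleep(0.5) # Give the launcher time to fire
        self.power_board.io_write(self.launcher_io_bit, 1) # Launcher off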

So imagine yourself standing in line at the grocery store and instead of texting your human buddies, you bring up Pet Companion, call your little guy/gal over (it isn't like they're busy), give them a treat, and engage in a quick round of fetch, then give them some more treats for good measure.

Wrapping up

You've just made an application with video streaming, deep learning, cloud communication, interactivity, IoT, and robotics – nice work! Give yourself a pat on the back (and a treat) for being such a good pet owner.

And, of course, your pet thanks you!

Here's the complete source code: FIXME!
