KitchenByte is a prototype of a gesture-based interactive platform designed for a smart kitchen. Built from a combination of existing technologies, KitchenByte attempts to show that adding visual information, even in a very familiar environment, can be beneficial to the user.
The resulting prototype uses a ceiling-mounted projector to display a mock-up of a kitchen countertop and stove area. Two webcams were positioned to capture the interface area, using OpenCV for object detection, recognition, and tracking. The prototype was programmed in Java using the Eclipse IDE, with Java Swing used to create the graphical user interface (GUI). The webcams tracked red or black objects across the x-y coordinates of the interface, standing in for more complex gesture and object recognition.
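The colour-based tracking can be illustrated with a simplified pure-Java sketch. The actual prototype used OpenCV; the class below is an illustrative stand-in (its name and threshold values are assumptions, not taken from the prototype's source) showing the underlying idea: threshold a frame for "red enough" pixels and return the blob's centroid as an x-y interface coordinate.

```java
import java.awt.Color;
import java.awt.image.BufferedImage;

// Simplified stand-in for the OpenCV colour-tracking step: scan a frame
// for strongly red pixels and return the centroid of the red blob.
// Thresholds are illustrative, not from the prototype.
class RedTracker {

    // A pixel counts as red when the red channel clearly dominates.
    static boolean isRed(int rgb) {
        Color c = new Color(rgb);
        return c.getRed() > 150 && c.getGreen() < 100 && c.getBlue() < 100;
    }

    // Returns the {x, y} centroid of red pixels, or null when no red
    // object is visible in the frame.
    static int[] centroid(BufferedImage frame) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < frame.getHeight(); y++) {
            for (int x = 0; x < frame.getWidth(); x++) {
                if (isRed(frame.getRGB(x, y))) {
                    sumX += x;
                    sumY += y;
                    count++;
                }
            }
        }
        if (count == 0) return null;
        return new int[] { (int) (sumX / count), (int) (sumY / count) };
    }
}
```

Comparing the centroid's position between frames is enough to recognise a simple swipe (a horizontal displacement above some threshold), which is how a colour marker can mimic a gesture.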
With a swipe gesture, the user chooses a recipe from a list (only two were shown for the prototype), and a photograph of the first step of the recipe is displayed with an instruction: put the pan on the stove. Once a pan is placed on the mock stove, it begins to “heat,” shown with a timer and progress bar. When it is ready, the interface displays the second instruction: put the food into the pan. The user pours the food (uncooked macaroni in this case) into the pan, prompting another timer and progress bar. When the food is ready, the interface notifies the user with text, another photograph, and a change in colour.
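The timed steps above can be sketched as a small state object driving the projected progress bar. This is a hypothetical reconstruction (class name, durations, and instruction strings are assumptions): a step fills from 0 to 100 percent over a fixed duration as a timer (e.g. a Swing `javax.swing.Timer`) ticks, then signals that the next instruction should be shown.

```java
// Hypothetical sketch of the heating-step logic behind the projected
// timer and progress bar; names and durations are illustrative.
class CookingStep {
    final String instruction;
    final long durationMs;
    long elapsedMs = 0;

    CookingStep(String instruction, long durationMs) {
        this.instruction = instruction;
        this.durationMs = durationMs;
    }

    // Called on each timer tick (e.g. a Swing Timer firing every 100 ms).
    void tick(long deltaMs) {
        elapsedMs = Math.min(durationMs, elapsedMs + deltaMs);
    }

    // Value for the projected progress bar, 0-100.
    int progressPercent() {
        return (int) (100 * elapsedMs / durationMs);
    }

    // True once the step has "cooked" long enough; the interface then
    // advances to the next instruction and photograph.
    boolean done() {
        return elapsedMs >= durationMs;
    }
}
```

In a Swing GUI the tick would update a `JProgressBar` via `setValue(progressPercent())`, and `done()` would trigger the switch to the next instruction screen.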