Introduction
From watches to cars to agricultural techniques, artificial intelligence and innovative software applications continue to affect more and more of the physical world. These new technologies are integrated into everyday life to increase efficiency and ultimately give people the freedom to spend more time doing what they want. Virtual assistants like Alexa and Google Home have changed the way people interact with their environment. A person’s voice is now all that is needed to learn about the weather or turn the lights on and off. While smart home technology is more convenient than previous innovations, it still requires active participation from the user through verbal commands. An ideal smart home device would already know what the user wants and would make decisions based on that knowledge. Additionally, this device would sense changes in the physical environment, such as temperature, weather conditions, or health changes experienced by the building’s inhabitants. The device would even be able to update the building configuration to create an optimal response with minimal human interaction time. Updating the building configuration might involve changing the thermostat, opening or closing a window, or sounding a security alarm. Research in this field is necessary to explore how different technologies can be used to create smarter buildings.
My research project, Project AURA, begins to ask questions about what we should expect from buildings in the coming years. The buildings of the future will enable people to minimize their power usage and environmental impact while simultaneously maximizing their human comfort and security. For example, a smart building could turn off its lights whenever there is no one inside, which could decrease energy consumption by at least 35–40% and up to 75%.1 Imagine the positive impact this would have on energy efficiency if every building could be equipped with this smart technology.
Methods
Project AURA centers on creating smart-building technology that is both functional and elegant; it explores alternatives to voice recognition that expand the ways a person can interact with their building space. A focus on human-centered design is fundamental to the creation of an interactive building that people will want to use. The final version of this project was featured in the iMotion Exhibition within the Stark Galleries in the Texas A&M Memorial Student Center from May to July 2018. Using different human interactions to signal specific structural responses, this exhibit allowed visitors to personally experience this project and envision what a future smart building might look like.
Project AURA implements interactive window blinds by combining robotic and architectural techniques. These interactive blinds stem from an architectural design by Dr. Negar Kalantar and Dr. Alireza Borhani, faculty members at Texas A&M University, in which the window blind structure opens and closes much as a flower’s petals fan out when it first blooms. The end goal was to create multiple blind structures, or AURA structures, that would move automatically based on sensory input. One AURA structure is defined as one of the square window-pane structures seen in Figure 1.
To begin this project, I needed to find a way to automate Dr. Kalantar’s and Dr. Borhani’s manual window blind design, so I began by creating an automatic response in one AURA structure using sensory input. After multiple ideas and iterations, two servo motors were placed under the bases of two of the fan blades, and an Arduino microcontroller was programmed to rotate the servo motors to a certain angle to open or close the AURA structure. Reliability testing was performed on the motors to ensure that they would not malfunction or overheat during a full day of use in the exhibition. I explored different motion, hand gesture, proximity, and LiDAR sensor technologies to see how I could program the Arduino microcontroller to create a unique opening or closing of the AURA structure based on sensory input from a person. A combined distance and hand gesture sensor and a LiDAR sensor were initially selected for the AURA controller system. The distance and hand gesture sensor was ideal for sensing the speed and direction of a person’s hand waves, capabilities that provided greater freedom in the AURA control scheme options. LiDAR sensors send out light energy in the form of a pulsed laser, much as radar and sonar systems send out radio and sound waves, to sense how close a person is to the sensor and how many people are present.2 The LiDAR sensor has a large sensing range of 40 meters, which was vital for determining where people were in a room. One possible configuration of the AURA structures and the LiDAR sensor can be seen in Figure 2.
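As a rough illustration of this single-structure behavior, the following minimal Arduino sketch opens or closes one AURA structure from a gesture reading, using the standard Servo library. It is a sketch of the approach rather than the exhibition code: the pin numbers, angle limits, and the readGestureDirection() helper are hypothetical stand-ins for the actual hardware and sensor library.

#include <Servo.h>

// Hypothetical pin assignments and angle limits; the exhibition
// hardware may have used different values.
const int SERVO_PIN_A = 9;
const int SERVO_PIN_B = 10;
const int CLOSED_ANGLE = 0;
const int OPEN_ANGLE = 90;

Servo bladeA;  // servo under the base of one fan blade
Servo bladeB;  // servo under the base of the other fan blade

// Hypothetical stand-in for the gesture/proximity sensor described
// above: returns +1 for an "open" wave, -1 for a "close" wave,
// and 0 when no gesture is detected.
int readGestureDirection() {
  return 0;  // stub; a real build would query the sensor library here
}

void setup() {
  bladeA.attach(SERVO_PIN_A);
  bladeB.attach(SERVO_PIN_B);
  bladeA.write(CLOSED_ANGLE);  // start with the structure closed
  bladeB.write(CLOSED_ANGLE);
}

void loop() {
  int gesture = readGestureDirection();
  if (gesture > 0) {          // "open" gesture: fan the blades out
    bladeA.write(OPEN_ANGLE);
    bladeB.write(OPEN_ANGLE);
  } else if (gesture < 0) {   // "close" gesture: fold the blades back
    bladeA.write(CLOSED_ANGLE);
    bladeB.write(CLOSED_ANGLE);
  }
}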
The second phase of this project consisted of designing an array of six AURA structures that could communicate with each other to create macro response settings. One of the major challenges with this project was designing the system to run consistently for the entire two-month exhibition. Because the exhibition ran for 11 hours a day for 61 total days, I wanted to minimize the number of times the structures opened and closed while no one was present in the room, to reduce wear on the structures and motors. This led me to focus on a macro setting mode in which a single hand gesture cascades all six structures open or closed. The control system can dynamically change the open or close direction immediately following a new gesture. The LiDAR sensor was not incorporated into the final design because it often failed to start up correctly and because the added sensory input would have increased how often the structures opened and closed. The final system layout of the six AURA structures is shown in Figure 3.
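A sketch of this macro cascade logic follows, again with hypothetical pins and timing and with one servo per structure for brevity. A new gesture immediately retargets the whole array and restarts the cascade; for clarity, each structure snaps to its target here, while the gradual one-degree stepping used in the real system is sketched after the next paragraph.

#include <Servo.h>

const int NUM_STRUCTURES = 6;
// Hypothetical servo pins, one per structure for brevity.
const int SERVO_PINS[NUM_STRUCTURES] = {3, 5, 6, 9, 10, 11};
const int CLOSED_ANGLE = 0;
const int OPEN_ANGLE = 90;
const unsigned long STAGGER_MS = 500;  // delay between neighboring structures

Servo servos[NUM_STRUCTURES];
int targetAngle = CLOSED_ANGLE;
unsigned long cascadeStart = 0;  // 0 means no cascade triggered yet

// Hypothetical stand-in for the hand gesture sensor: +1 opens,
// -1 closes, 0 when no gesture is seen.
int readGestureDirection() {
  return 0;  // stub
}

void setup() {
  for (int i = 0; i < NUM_STRUCTURES; i++) {
    servos[i].attach(SERVO_PINS[i]);
    servos[i].write(CLOSED_ANGLE);
  }
}

void loop() {
  int gesture = readGestureDirection();
  if (gesture != 0) {
    // A new gesture retargets the whole array and restarts the
    // cascade immediately, even if the structures are mid-motion.
    targetAngle = (gesture > 0) ? OPEN_ANGLE : CLOSED_ANGLE;
    cascadeStart = millis();
  }
  // Each structure begins moving STAGGER_MS after its neighbor,
  // producing the rippling open/close effect across the array.
  for (int i = 0; i < NUM_STRUCTURES; i++) {
    if (cascadeStart != 0 &&
        millis() - cascadeStart >= (unsigned long)i * STAGGER_MS) {
      servos[i].write(targetAngle);
    }
  }
}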
As the sole researcher on this project, I programmed each of the microcontroller options and validated their successful operation. Processing sensory data and moving multiple servo motors at the same time was a complex task. I used a timer function to control how often each servo motor rotated: a motor normally rotates one degree at a time, with a timing delay between steps. I also set a direction and an end-goal angle state for each motor that could be updated based on the sensory input. The final result of this complex system was an array of six AURA structures that people can open or close with simple interactions.
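The sketch below illustrates this timer-based motion scheme for a single motor, using the standard Arduino Servo library and millis() for non-blocking timing; the pin number and step interval are illustrative assumptions rather than the exhibition values. Because updateMotor() never blocks, the loop stays free to process sensory input that may retarget the goal angle at any moment.

#include <Servo.h>

// Per-motor state: each servo tracks its current angle, an end-goal
// angle (which sensory input may change at any time), and the time of
// its last one-degree step.
struct BladeMotor {
  Servo servo;
  int currentAngle;
  int goalAngle;
  unsigned long lastStepMs;
};

const int SERVO_PIN = 9;                    // hypothetical pin
const unsigned long STEP_INTERVAL_MS = 25;  // delay between 1-degree steps

BladeMotor motor;

void setup() {
  motor.servo.attach(SERVO_PIN);
  motor.currentAngle = 0;
  motor.goalAngle = 0;
  motor.lastStepMs = 0;
  motor.servo.write(motor.currentAngle);
}

// Non-blocking update: once the timer interval has elapsed, move the
// servo one degree toward its goal, leaving the processor free to keep
// reading sensors between steps.
void updateMotor(BladeMotor &m) {
  if (m.currentAngle == m.goalAngle) return;          // already at goal
  unsigned long now = millis();
  if (now - m.lastStepMs < STEP_INTERVAL_MS) return;  // not time to step yet
  m.currentAngle += (m.goalAngle > m.currentAngle) ? 1 : -1;
  m.servo.write(m.currentAngle);
  m.lastStepMs = now;
}

void loop() {
  // Sensor processing would run here and may set motor.goalAngle to a
  // new target at any time; the motion adapts on the very next step.
  updateMotor(motor);
}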
Results
The array of AURA structures, combined with an Arduino microcontroller, encouraged human interaction by creating an interesting and aesthetic response that the user could control. The beauty of this design is that the user has multiple options for controlling the array of AURA structures and that the Arduino microcontroller can update the system in real time by continuously processing sensory data from its interactions with people in the room. As a result, the user feels as though the building is alive and responsive. While people enjoyed opening and closing these blinds, the AURA structures still function as blinds; however, a person can control them in a way unlike anything previously created. During the exhibition, people of all ages and backgrounds interacted with these AURA structures and glimpsed what their future might look like if their homes or businesses utilized these structures. It was amazing to watch people’s faces light up once they saw my AURA structures move and recognized the progression of the AURA architecture design within the exhibition. My project was a centerpiece of the exhibition, as it was the only piece that moved and was the culmination of years of iterative designs. My role in this exhibition was vital in bringing these designs to life and taking them to the next level as they became fully functioning structures. This progression can be seen below in Figure 4.
In talking with people at the exhibition, I could feel their excitement as they imagined new ways these AURA structures could be used within a building or home. This excitement was contagious, as people from other parts of the museum were drawn to the AURA structures to see why and how the structures were moving. The most popular questions I was asked were “What is next for this project?” and “When can I buy one?”
Advantages
While my research focused on one specific aspect of creating an interactive building, it shows on a small scale the capabilities of current mechanical and software technologies. Increasing the quantity and size of the AURA structures utilized by one microcontroller has a minimal effect on the system complexity. On a larger scale, the AURA structures could be placed on the sides of smart buildings to act as blinds and drastically decrease a building’s energy costs. The AURA system makes it easier to close the blinds on a hot, sunny day, a simple action that could decrease the heat gain of a room by 45% and, through its interactive nature, increase the productivity and overall happiness of building inhabitants.3 Furthermore, the feedback this project gathered on how people like to interact with their building space will be valuable to ongoing research projects specializing in human-computer interaction in the fields of architecture and robotics.
Future Work
With this project being part of the Center for Infrastructure Renewal, my goal was to start the conversation about how buildings can be modified to beautifully interact with their inhabitants while decreasing energy consumption. Researchers at the Center for Infrastructure Renewal find innovative solutions to some of the nation’s toughest problems surrounding construction, material design, and infrastructure systems. Project AURA takes the ideas of a futuristic building and brings them into the physical world. There still remains work to be done in the creation of a complete interactive smart building. However, I believe that this project will be a launching pad as we continue to look at how buildings can adapt to improve the quality of human life.
Zachary Kitowski '19
Zach Kitowski is a graduating senior mechanical engineering major with a minor in computer science from Colleyville, Texas. Zach performed the research for this article under the supervision of Dr. Dylan Shell and plans to continue it by attending graduate school beginning in the fall 2019 semester. Zach’s eventual goal is to launch his own company after the completion of his education.
Acknowledgments
I thank Dr. Kalantar and Dr. Borhani for giving me the opportunity to expand upon their building structure design. I thank Dr. Dylan Shell, a computer science faculty member at Texas A&M University, for giving me the opportunity to work in the Distributed AI & Robotics Lab. I also thank my friends Ryan Hernandez, Shelby Cielencki, Meagan Gimbert, and Austin Keith for their support and help in installing the AURA structures in the gallery.

References