An app developed by Cornell researchers uses augmented reality to help users repeatedly capture images of the same location with a phone or tablet to create time-lapse videos – without leaving a camera behind.
Time-lapse photography, which involves combining photos taken over long periods of time, offers a powerful way to visualize phenomena such as the changing seasons or the movement of the sun. Traditionally, photographers left a camera on a tripod for the duration of the event, but researchers working with Abe Davis, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science, have come up with a more practical method. Their iOS app, ReCapture, is now available for free in the Apple App Store.
The researchers believe this is the first app designed to create time-lapse videos with handheld mobile devices.
Ruyu Yan ’23, a computer science major from Cornell Engineering who was the lead developer of the app, presented the work, “ReCapture: AR-Guided Time-lapse Photography,” at the 2022 Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology on November 1.
The app has three capture modes that cover a range of scenarios. One works best for landscapes, another helps capture close-up scenes, and a third collects a range of images that can be used to reconstruct the scene in 3D offline.
Each capture mode uses different scene information. The simplest mode overlays previous photos on the live view to help the user line up new ones. For close-up scenes, which tend to be harder to capture, the app estimates where the camera is in 3D space and uses arrows to guide the user to move and tilt their phone toward the correct location.
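The app's source is not reproduced here, but the core idea behind that close-up guidance — comparing the current camera pose against the pose saved with the original photo and steering the user back toward it — can be sketched roughly as follows. The function name, the pose representation, and the numbers are all illustrative assumptions, not the team's actual implementation:

```python
def guidance_offset(saved_position, current_position, current_rotation):
    """Return the offset from the user's current viewpoint to the saved one,
    expressed in the current camera's coordinate frame so it can be drawn
    as on-screen "move left/right/up/down/forward" arrows.

    saved_position, current_position: [x, y, z] in world coordinates.
    current_rotation: 3x3 matrix mapping camera coordinates to world
    coordinates (the kind of pose an AR framework typically reports).
    """
    # Offset to the saved viewpoint, still in world coordinates.
    world = [s - c for s, c in zip(saved_position, current_position)]
    # A rotation matrix's inverse is its transpose, so multiplying by the
    # transpose takes the world-space offset into the camera's own frame.
    return [sum(current_rotation[row][i] * world[row] for row in range(3))
            for i in range(3)]

# Example: camera axes aligned with the world (identity rotation); the saved
# viewpoint is 0.5 m to the right and 0.2 m ahead of the user.
offset = guidance_offset([0.5, 0.0, 0.2], [0.0, 0.0, 0.0],
                         [[1, 0, 0], [0, 1, 0], [0, 0, 1]])
# A positive first component would drive a "move right" arrow, and so on.
```

In a real AR session the rotation would come from the framework's camera transform rather than being hand-built, and the app would also compare orientations to tell the user how to tilt the phone; this sketch covers only the positional part.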
The work grew out of Yan’s summer research with Davis through the Computer Science Undergraduate Research Program. Yan had mentioned an interest in geocaching, an activity where participants use GPS to locate a trinket box called a cache, hidden by other geocaching enthusiasts. Meanwhile, Davis envisioned a project that would help field researchers repeatedly find and rephotograph precise locations at their field sites to track any changes. Together they came up with the idea of “geocaching with images,” which eventually evolved into ReCapture.
“Geocaching might be something people do for fun, but if you’re a scientist and you’re doing field work, then there’s a similar kind of issue at play,” Davis said.
Jiatian Sun, a doctoral student in the field of computer science, and Longxiulin Deng ’23, a computer science student at Cornell Engineering, also participated in the study.
Yan said the hardest part was developing the app’s interface to guide users through the process, because “what works intuitively for me may not work intuitively for others.” She solicited feedback from 20 beta testers and also worked with the XR Collaboratory at Cornell Tech, which advises researchers on augmented reality, virtual reality and mixed reality applications.
In addition, she had to figure out how to manage the mountains of data associated with the photos. “The app was crashing a lot,” she said. This was a problem because if the app was too slow or constantly crashing, people wouldn’t collect enough footage, resulting in choppy, low-quality videos.
In future versions of the app, Davis believes they might be able to smooth out gaps and abrupt transitions in footage using recent machine learning techniques, resulting in higher quality videos.
Besides creating GIFs and videos, the app may also have valuable scientific applications, as Davis envisioned. The team has shared the app with field researchers from other Cornell departments, and colleagues from the School of Integrative Plant Science at the College of Agriculture and Life Sciences have already started using it to collect data.
This work was partially funded through a donation from Meta.
Patricia Waldron is an editor at Cornell Ann S. Bowers College of Computing and Information Science.