Carnegie Mellon University
June 15, 2020

Changing the Way the World Shops

INI Students Place First in Autonomous Checkout Competition

By Jessica Shirley

Waiting in line is a universal part of any shopping experience. Yet machine learning and AI are poised to transform the retail industry through autonomous checkout, and a team of Information Networking Institute (INI) mobility students is leading the way.

Yixin Bao, Xinyue Cao, Chenghui Li and Mengmeng Zhang placed first in the inaugural AutoCheckout Competition, organized by AiFi Research and Carnegie Mellon University. The competition brought together researchers to develop better, faster and more accurate autonomous checkout software solutions. It featured eight teams from four continents and took place online as part of Cyber-Physical Systems and Internet-of-Things (CPS-IoT) Week, April 21-23, 2020.

“As far as we know, we’re the first to open source an end-to-end solution for an autonomous store,” said Chenghui Li. “We are so proud to make this contribution to the industry.”


The INI team formed through a project in Professor Pei Zhang’s Mobile and Pervasive Computing course at CMU Silicon Valley.

The hands-on course explores research issues in the newly emerging field of mobile computing.

In the competition, teams were asked to create a system that could generate receipts for each customer based on recorded sensor data and an inventory database in a cashierless convenience store. A receipt is considered correct only if all items in the estimated receipt match the items in the ground truth receipt.
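The all-or-nothing scoring rule can be illustrated with a short sketch. The item names and the matching code below are our own illustration, not the organizers' implementation:

```python
from collections import Counter

def receipt_correct(predicted, ground_truth):
    # A receipt counts as correct only when every item and its
    # quantity match the ground-truth receipt exactly.
    return Counter(predicted) == Counter(ground_truth)

predicted = ["Kettle Corn"] * 9 + ["Sea Salt Popcorn"] * 6
truth = ["Kettle Corn"] * 9 + ["Sea Salt Popcorn"] * 6

print(receipt_correct(predicted, truth))                   # True
print(receipt_correct(predicted, truth + ["Skinnypop"]))   # False
```

Because a single missed or extra item invalidates the whole receipt, the metric rewards systems that are reliable end to end, not just accurate on average.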

“In one semester, the INI team was able to develop and implement a sensor fusion approach that beat out seven other teams in the competition,” said Professor Zhang, associate research professor in the INI and Electrical and Computer Engineering (ECE). He was particularly impressed that the team achieved an overall F1 score of 84.4 percent in real-world scenarios, close to the reported accuracy of conventional self-checkout systems.

How It Worked

[Animation: a sample multi-shopper test scenario]

The system was run on over 60 scenarios involving one to three shoppers. Teams were given two days to debug and fine-tune their systems on these sample test cases.

For example, in the above scenario, the team generated the following receipt:

============= Our Predicted Receipt =============
Customer ID: 14322669897997084492
Purchase List:
9 x Boomchickapop Sweet & Salty Kettle Corn
6 x Boomchickapop Sea Salt Popcorn
6 x Skinnypop Popcorn
F1-score: 97.6%
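The per-receipt F1 score shown above compares predicted item counts against the ground truth. A plausible sketch of such item-level scoring follows; the exact competition formula is our assumption:

```python
from collections import Counter

def receipt_f1(predicted, truth):
    # Item-level F1: each correctly credited unit is a true positive;
    # precision and recall are computed over unit counts.
    p, t = Counter(predicted), Counter(truth)
    tp = sum(min(p[k], t[k]) for k in p)
    precision = tp / sum(p.values()) if p else 0.0
    recall = tp / sum(t.values()) if t else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(receipt_f1(["corn"] * 9, ["corn"] * 9))  # 1.0 for a perfect receipt
```

A receipt with one missed item out of ten, for example, still scores about 0.95, so F1 gives partial credit where the all-or-nothing correctness check does not.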


“In those two days, we worked from day to night and kept improving our system’s accuracy. We also engineered efficient automation in our system to streamline the whole submission process,” said Mengmeng Zhang. “The four of us really came up with a great end-to-end system that just works.”

The COVID-19 pandemic posed a few hurdles for the competition. Very quickly, organizers moved the event online and competitors adjusted their teamwork approach.

“Ever since the shelter-in-place order, we transitioned to online meetings and they were just as fun as the in-person ones,” said Yixin Bao. “The circumstances didn’t handicap our teamwork. With all those online collaboration tools out there, we remained productive.”

The Future of Retail?

According to the organizers, Americans spend roughly 37 billion hours each year waiting in line. Not only could autonomous shopping enable us to skip the line, but it could also offer greater convenience through 24/7 shopping, real-time stock analysis and greater insights into shopping behavior. By fusing artificial intelligence, sensor technology and computer vision, researchers are designing new approaches and driving the growth of autonomous retail.

“In my humble opinion, autonomous stores are the future of retail,” said Xinyue Cao. “Imagine going into an autonomous store, grabbing what you want, and then walking out, smoothly and with no lag.”

Competition Awards Ceremony

INI Team Abstract

“Multi-Person Shopping (MPS) for Cashier-Less Store”

Team: Yixin Bao, Xinyue Cao, Chenghui Li and Mengmeng Zhang
Backed by the rise of Artificial Intelligence and the Internet of Things, autonomous checkout is advancing rapidly and holds tremendous opportunities in the near future. Two key components of a cashier-less store are inventory monitoring and customer-merchandise association. Addressing both, we built an end-to-end multi-modal solution for autonomous checkout. For inventory monitoring, we constructed a sensor fusion framework using weight sensors and knowledge of item arrangement. For customer-merchandise association, we extracted 3D human keypoints from cameras to assign merchandise to the corresponding customers. Our solution handles a variety of complex shopping scenarios, including multi-person shopping, products lifted and put back, and product misplacement. We won the CPS-IoT Week 2020 AutoCheckout Competition with an F1 score of 84.4%.
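The abstract's weight-based inventory monitoring can be sketched as inferring which item, and how many units, left a shelf from the change in shelf weight. Everything below — the item weights, the tolerance, and the matching logic — is a simplified illustration, not the team's actual implementation:

```python
def detect_pickup(weight_before, weight_after, shelf_items, tol=5.0):
    # Match the weight delta (in grams) against known per-item weights
    # for this shelf; return the (item, count) that best explains it.
    delta = weight_before - weight_after
    best = None
    for name, unit_weight in shelf_items.items():
        count = round(delta / unit_weight)
        error = abs(delta - count * unit_weight)
        if count > 0 and error <= tol:
            if best is None or error < best[2]:
                best = (name, count, error)
    return (best[0], best[1]) if best else None

shelf = {"kettle corn": 142.0, "skinnypop": 125.0}
print(detect_pickup(1000.0, 716.0, shelf))  # ('kettle corn', 2)
```

Knowledge of item arrangement narrows `shelf_items` to what is actually stocked on that shelf, which is what makes the delta-matching tractable.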

The team's code is now open source on GitHub.
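The customer-merchandise association step described in the team's abstract — using 3D human keypoints to decide who took an item — can be sketched as a nearest-wrist assignment. The coordinates, customer IDs, and distance rule here are illustrative assumptions:

```python
import math

def assign_to_customer(shelf_pos, wrist_keypoints):
    # Attribute a detected pickup to the customer whose tracked 3D
    # wrist keypoint is nearest to the shelf slot at that moment.
    return min(wrist_keypoints,
               key=lambda cid: math.dist(wrist_keypoints[cid], shelf_pos))

wrists = {"customer_1": (0.2, 1.0, 0.5), "customer_2": (2.0, 1.1, 0.5)}
print(assign_to_customer((0.0, 1.1, 0.5), wrists))  # customer_1
```

In a real store the keypoints would come from multi-camera pose estimation and be tracked over time, so ambiguous cases (two shoppers reaching for the same shelf) can be resolved with temporal context rather than a single snapshot.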

About the Competition

[Image: AiFi Research banner]

Organizers
  • João Diogo Falcão (AiFi Research & Carnegie Mellon University)
  • Carlos Ruiz (AiFi Research)
  • Hae Young Noh (Stanford University)
  • Pei Zhang (Carnegie Mellon University)
  • Shijia Pan (UC Merced)

Photo Credit Source: https://www.aifi.io/research

Autonomous retail has the potential to change the way people perceive shopping, much as e-commerce did. Autonomous stores could offer the convenience of 24/7 operation close to the customer, eliminate friction (e.g. waiting in line to pay), monitor stock in real time and better understand human shopping behavior. In recent years, several automated retail technologies have been proposed. However, the accuracy and cost-effectiveness of these approaches have been a major bottleneck preventing large-scale deployments and their study. This competition aims to bring industry and academia closer together by reducing the barrier of entry for researchers to access data and infrastructure. This will allow the community to design new approaches and compare their performance under similar conditions. Learn More