About

Roundup is a collaborative game that mixes augmented reality with physical interaction. The goal is to herd sheep and protect them from being eaten by wild foxes.

How to play?

The objective of the game is to herd as many sheep as you can into a safe zone. Two or more people are needed to play. A top-down view of the environment is displayed on a horizontal screen that the players gather around. One player takes the role of a farmer who tries to move all the sheep on the map into a safe zone. That player sees the game environment in 3D through an augmented reality display on a mobile phone and moves the farmer with a joystick control on the phone's screen. The sheep follow the farmer as long as he does not move too far away. The other players help the farmer by placing bridges over rivers and fences to steer the sheep and keep them away from hungry foxes. They can also clear away clouds that gradually gather and occlude the view. To do so, they simply place physical objects representing the action they want to perform (e.g. a miniature bridge to place a bridge and a miniature fence to place a fence) on top of the screen showing the top-down view. The game ends when there are no wild sheep left on the map. How many did you manage to save?
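The "sheep follow the farmer while he stays close" behaviour described above can be sketched in a few lines of Python. This is only an illustration of the idea, not the game's actual code (the game implements its boid behaviour in Unity/C#), and the radius and speed constants are made-up values:

```python
import math

FOLLOW_RADIUS = 5.0  # illustrative value; the real game tunes this in Unity
SPEED = 1.0          # illustrative sheep movement speed per tick

def dist(a, b):
    """Euclidean distance between two 2D points."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def step_sheep(sheep_pos, farmer_pos, dt=1.0):
    """Move a sheep one tick toward the farmer, but only while the
    farmer is within the follow radius; otherwise the sheep stays put."""
    d = dist(sheep_pos, farmer_pos)
    if d == 0 or d > FOLLOW_RADIUS:
        return sheep_pos  # farmer too far away: the sheep stops following
    step = min(SPEED * dt, d)  # do not overshoot the farmer
    nx = sheep_pos[0] + (farmer_pos[0] - sheep_pos[0]) / d * step
    ny = sheep_pos[1] + (farmer_pos[1] - sheep_pos[1]) / d * step
    return (nx, ny)
```

In the actual game this per-sheep rule is combined with flocking behaviour (via the Flock Box DOTS asset credited below), so sheep also react to each other and to foxes.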

Demo

Project Demo

Making Of

Making of Roundup


Goals & Motivation

We think it is interesting when different modalities come together to create a unique gaming experience. We were particularly interested in making an asymmetric collaborative game where different players use different modalities to help each other reach a common goal. In our game we combine augmented reality with tangible interaction to create a game that is interesting in terms of both graphics and interaction. We were also motivated to develop an AR application and to explore image/object recognition, as none of us had any previous experience in those areas. The asymmetric nature of the game gives players in different roles a different experience while they collaborate towards the same goal. We chose to design our game around the Icelandic sheep roundup, as it is a big part of Icelandic culture. We hope that by playing our game, people will get a glimpse into the experience of participating in the Icelandic roundups.

Goal 1

Create a functional multimodal game
    Challenge: networking across different devices

Goal 2

Create a well-balanced and satisfying cooperative gaming experience
    Challenge: making the game fun across all roles

Goal 3

Create a pleasant visual experience for both players
    Challenge: making the game look coherent


Technologies

These are the technologies we used to create the game.

We used the Unity game engine to develop the game since we all have prior experience with it. Because Unity is so widely used, it is easy to find information as well as free game assets online, and it is well suited for mobile game development.

We used Mirror to handle the network communication in the game. Mirror is a widely used open-source networking plugin for Unity that provides server-client communication, which is why we chose it.
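Mirror follows a server-authoritative model: clients send inputs to the server, the server runs the simulation, and state changes are pushed back to every client. The sketch below illustrates that pattern in plain Python; the class and method names are our own illustration and are not Mirror's actual API (in Mirror this corresponds roughly to Commands and SyncVars):

```python
class GameServer:
    """Toy model of a server-authoritative game loop: the server owns
    the state, applies client inputs, and broadcasts the result."""

    def __init__(self):
        self.farmer_pos = [0.0, 0.0]  # authoritative state lives here
        self.clients = []

    def on_client_input(self, dx, dy):
        # Inputs are applied on the server, never directly on a client
        # (comparable to a [Command] method in Mirror).
        self.farmer_pos[0] += dx
        self.farmer_pos[1] += dy
        self.broadcast_state()

    def broadcast_state(self):
        # Push the updated state to every connected client
        # (comparable to a SyncVar update in Mirror).
        for client in self.clients:
            client.on_state(tuple(self.farmer_pos))


class GameClient:
    """Each client keeps only a replicated copy of the server's state."""

    def __init__(self):
        self.farmer_pos = None

    def on_state(self, pos):
        self.farmer_pos = pos
```

This separation is what keeps the phone (AR view) and the horizontal screen (top-down view) showing the same world, since both render the same server-owned state.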

We used Blender to create the 3D models for the game and for the 3D-printed tangibles, as well as for rigging and animation. We chose Blender because it is an open-source, relatively easy-to-use toolset for creating 3D computer graphics that also supports rigging and animation.

AR Foundation

We used the AR foundation framework for the augmented reality part of the game as it is made for Unity and allows you to work with both ARKit for iOS and ARCore for Android. 

We used Vuforia to handle the image recognition in the game. We chose Vuforia because it offers reliable image recognition and can track multiple images simultaneously.
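When a tangible (bridge, fence, cloud-clearing object) is recognized on top of the horizontal screen, its detected position has to be translated into a game action at the matching map coordinate. The sketch below shows that dispatch step in Python; the target names and function are hypothetical illustrations, not the real Vuforia target names or API used in the game:

```python
# Hypothetical mapping from recognized image-target names to game actions.
# These names are illustrative; the real targets are defined in Vuforia.
ACTIONS = {
    "bridge": "place_bridge",
    "fence": "place_fence",
    "cloud_brush": "clear_clouds",
}

def on_target_detected(target_name, screen_x, screen_y, screen_w, screen_h):
    """Convert a tangible detected on the horizontal display into a
    game action at a normalized (0..1) map coordinate."""
    action = ACTIONS.get(target_name)
    if action is None:
        return None  # unknown object on the table: ignore it
    return (action, screen_x / screen_w, screen_y / screen_h)
```

Normalizing to 0..1 coordinates keeps the mapping independent of the screen's resolution, so the same code works on any horizontal display.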

We used GitHub for code management, as we all have experience with it and it supports branching and version control. We also used GitHub to oversee and distribute tasks.

Challenges & Obstacles

The main challenges in developing the game were getting the networking to work across different devices, designing the game so that it would be fun to play in every role, and making the game look and feel coherent in terms of graphics and interaction. Other obstacles we ran into during the making of the game included syncing problems between the server and clients, image recognition that was not as precise as we wanted, the placement of game objects in AR, and not being able to build Vuforia on a PC.

Lessons Learnt

During the course of this project we all developed our knowledge of computer graphics, interaction design and game design. We also learnt a lot about AR development and image recognition. Some practical lessons we will take away from this project are to integrate all of the individual components early and often, and to expect the unexpected when preparing for demos and exhibitions.

Related Work

AR in the Gallery: The Psychogeographer’s Table

Juliano Franz, Mohammed Alnusayri, Joseph Malloch, Akshay Gahlon, and Derek Reilly. 2019. AR in the Gallery: The Psychogeographer's Table. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA '19). Association for Computing Machinery, New York, NY, USA, Paper LBW0275, 1–6. https://doi.org/10.1145/3290607.3312898

The physical installation consists of a large wooden tabletop representing a city's landscape and streets. The authors used projection-mapping software to display aerial photos and historical maps directly on the tabletop. Virtual models of key historical and modern buildings are "placed" in their appropriate positions and orientations. Visitors can explore the buildings by walking around the table wearing a HoloLens and can expand individual buildings using a combination of gestures and a hand-held button.

A Smartphone-Based Tangible Interaction Approach for Landscape Visualization

Liviu Coconu, Brygg Ullmer, Philip Paar, Jing Lyu, Miriam Konkel, and Hans-Christian Hege. 2018. A Smartphone-Based Tangible Interaction Approach for Landscape Visualization. In Proceedings of the 7th ACM International Symposium on Pervasive Displays (PerDis '18). Association for Computing Machinery, New York, NY, USA, Article 23, 1–2. https://doi.org/10.1145/3205873.3210707

This project uses mobile phones and augmented reality to explore and visualize landscapes. This is done with optical tracking of fiducials on a physical map. The phone can be used to control a large display by sending position and orientation data to the host computer, as well as to display AR objects on the physical map.

Herding Nerds on Your Table: NerdHerder, a Mobile Augmented Reality Game 

Yan Xu, Sam Mendenhall, Vu Ha, Paul Tillery, and Joshua Cohen. 2012. Herding nerds on your table: NerdHerder, a mobile augmented reality game. In CHI '12 Extended Abstracts on Human Factors in Computing Systems (CHI EA '12). Association for Computing Machinery, New York, NY, USA, 1351–1356. https://doi.org/10.1145/2212776.2212453

NerdHerder is a mobile AR game that utilizes the physical position and movement of the phone as part of the game mechanic. In the game you take the role of an IT manager who has to herd his nerd employees back to their cubicles. This is done by attaching different things to a fishing rod to either attract or repel the nerds. The rod is attached to the camera and controlled by the position and movement of the phone, prompting the players to move around the table.


Photo Gallery


Credits

Low Poly Fence Pack by Broken Vector // low poly 3D models of fences - link to asset

Character Free low-poly 3D model by omarbenlakhdar // low poly 3D model of farmer - link to asset

Flock Box DOTS by Cloud Fine // used for sheep and fox behaviour - link to asset

Sheep Mega Toon Series by Meshtint Studio // 3D models and animations of sheep - link to asset

Low poly plant set Free low-poly 3D model by slidenkoartur // 3D models of rocks and greenery - link to asset

Joystick Pack by Fenerax Studios // virtual joystick for mobile phone - link to asset

Team

The team

Programmer and AR development

Guðrún Margrét Ívansdóttir

In charge of the AR development. Also created a particle system, shader and behaviour for the clouds in the game as well as other particle systems. Made a texture and animation controller for the player. Took part in the design and decision making related to the concept and user experience of the game.

Programmer and Image Recognition

María Ómarsdóttir

In charge of the image recognition. Also created 3D models, 3D printed tangibles and particle systems. Was in charge of user studies and created the UI for the game. Took part in the design and decision making related to the concept and user experience of the game.

Programmer and Project Management

Sindri Pétursson

In charge of project management. Focused on the sheep and fox behaviour, created particle systems and sound effects for the game. 

Also made the demo and making of videos for the project. Took part in the design and decision making related to the concept and user experience of the game.

Programmer, Modelling and Animation

Hallbjörg Embla Sigtryggsdóttir

Created 3D models for the game map. Also created a 3D model, rig and animation for the foxes. Made a joystick control for the phone for the player movement. Also created this website. Took part in the design and decision making related to the concept and user experience of the game.

Lead Programmer

Kári Steinn Aðalsteinsson

In charge of networking and overseeing all programming related tasks with a special focus on AR development and image recognition. Also created a water shader. Took part in the design and decision making related to the concept and user experience of the game.
