
Classroom Internet of Things

Fragment of Sense script that updates and reads a Xively feed

Recently I have been working on getting the Sense platform to work with Xively - a cloud service for sharing data from Internet of Things (IoT) devices.  This has been in the context of a project funded by the UK's Technology Strategy Board, called DISTANCE: Demonstrating the Internet of School Things – A National Collaborative Experience.  The project will use a number of different IoT devices, from scientific data loggers, air quality monitors and weather stations to the OU's Sense board, to demonstrate how learning can be enhanced by IoT technologies.  Working with schools across the UK, we are installing these devices in classrooms and playgrounds (really, wherever the students and teachers want) and making the data streams they generate available as Xively feeds.  This allows the schools to use each other's data in ways that allow students to carry out experiments using real data gathered from sites a...
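For readers curious about what "updating and reading a Xively feed" involves under the hood, here is a minimal sketch of doing the same over Xively's documented v2 REST API, outside of Sense.  The feed ID, API key and channel name below are placeholders, not details of the actual DISTANCE feeds:

```python
import json
import urllib.request

XIVELY_API = "https://api.xively.com/v2/feeds"


def build_update(channel_id, value):
    """Build the JSON body Xively expects when updating a datastream."""
    return json.dumps({
        "version": "1.0.0",
        "datastreams": [{"id": channel_id, "current_value": str(value)}],
    })


def update_feed(feed_id, api_key, channel_id, value):
    """PUT a new current_value to one channel (datastream) of a feed."""
    req = urllib.request.Request(
        f"{XIVELY_API}/{feed_id}.json",
        data=build_update(channel_id, value).encode(),
        headers={"X-ApiKey": api_key, "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


def read_feed(feed_id, api_key):
    """GET the whole feed, returning its parsed JSON representation."""
    req = urllib.request.Request(
        f"{XIVELY_API}/{feed_id}.json",
        headers={"X-ApiKey": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Authentication is just a per-feed API key sent in the `X-ApiKey` header, which is part of what makes the service approachable for classroom use.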

Engineering Adaptive Software Systems (EASSy) Workshop

My introduction video for the EASSy workshop

I am headed to Japan next week to participate in the NII-Shonan Workshop on Engineering Adaptive Software Systems.  This event will bring together an international group of software engineering researchers who are working on the challenges of realising adaptive systems.  I am looking forward to attending this year, having missed out on the first edition, which I helped organise last year!

Best Paper Award at EICS 2013!

Best Paper Award - EICS 2013

My research student Pierre Akiki, whom I supervise with Yijun Yu, presented his work on Role-based User Interface Simplification at the ACM SIGCHI Symposium on Engineering Interactive Computing Systems this week.  This work is part of his research on a model driven architecture for adaptive enterprise user interfaces, and it was great to see it recognised by the EICS organisers with the "Best Paper Award".  It was my first time at EICS and I was really impressed by the range of work, the interesting discussions and the excellent presentations.  There is a lot of interest in adaptive systems that can enhance users' experience of a system, and my conversations with a number of people highlighted potential directions for future work that might lead to some useful collaborations.  Other than the announcement of our award, particular highlights of the week for me included: A keynote on "Using the Crowd to Understand and Ad...

Engineering Adaptive User Interfaces @ EICS'13

Cedar Studio: RBUIS Demonstration

My research student Pierre Akiki, whom I supervise with Yijun Yu, is working on a model driven architecture for adaptive enterprise user interfaces.  The overall objective of this work is to allow developers and IT operations staff to configure enterprise applications to perform run-time adaptations that make the user interfaces of these systems easier to use depending on the users' context.  We will be describing three different aspects of this work at the upcoming ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS 2013) - namely, a role-based approach to user interface adaptation; some initial ideas for crowdsourcing user interface adaptations; and a tool demonstration of Cedar Studio, an IDE that supports the design and implementation of adaptive enterprise user interfaces.  Details of the papers are as follows: Akiki, Pierre; Bandara, Arosha K. and Yu, Yijun (2013). RBUIS: simplifying enterprise ap...

Code Club Session 2 - Witch-whacking

Screenshot of a completed CodeClub 'Whack-a-Witch' project

The second session of CodeClub at St. Bernadette's Primary School in Milton Keynes took place yesterday.  We managed to sort out most of the technical problems, so things got started a lot quicker this time.  As planned, the children spent the first half of the session finishing off their 'Felix and Herbert' games, with those that finished early helping their friends.  There was also a lot of 'tweaking' of the scoring code, sprite costumes and sounds - which was great because it made the children really feel that they'd created their own personal version of the game.  Once everyone had completed the first project, we moved on to the next worksheet - the 'Whack-a-Witch' project.  This time I just handed out the worksheets and let them go at it.  Now that they were familiar with the Scratch interface and had got the hang of using the colours to help them find different program blo...

CodeClub Session 1 - what a blast!

CodeClub materials prepared for Session 1

I delivered my first session of CodeClub yesterday at St. Bernadette's Catholic Primary School in Monkston Park.  We took a while to get started properly while some technical issues with laptops not being fully charged or user accounts not working were being sorted out.  We used the time to get everyone signed up to the club with a nice name badge.  There are 15 children in total (8 boys, 7 girls) and a few of them had done little bits of programming before (e.g., scripting animations and games similar to CargoBot).  I didn't hand out the worksheets right at the beginning because I wanted to do some step-by-step walkthroughs to get the kids familiar with the Scratch environment.  However, it was apparent that the children were really excited about getting to program their own game and impatient to get started.  So once they'd figured out the basics of putting blocks together, we handed out the worksheets an...

Role-based User Interface Simplification

Demonstration of Role-based User Interface Simplification

My research student Pierre Akiki, whom I supervise with Yijun Yu, is working on a model driven architecture for adaptive enterprise user interfaces.  The overall objective of this work is to allow developers and IT operations staff to configure enterprise applications to perform run-time adaptations that make the user interfaces of these systems easier to use depending on the users' context.  In the above video he describes an approach that uses user roles as a way of specifying adaptive behaviours that will 'simplify' a given interface.  We define simplification as the process of selecting the minimal feature set required by the user to complete a task in a given context and then optimising the layout of the UI elements associated with this feature set.  Update (11 March 2013): This work has been accepted for publication at EICS 2013 and I will post more details once we've finalised the camera-read...
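Pierre's actual approach is model driven and built into Cedar Studio, but the core idea of minimal-feature-set selection can be illustrated with a toy sketch.  The roles, feature names and data structures below are invented purely for this example and are not part of the RBUIS implementation:

```python
# Toy illustration of role-based UI simplification: each role maps to the
# minimal set of features its users need for their tasks, and only the
# widgets bound to those features survive into the simplified interface.

ROLE_FEATURES = {
    "cashier": {"new_sale", "payment"},
    "manager": {"new_sale", "payment", "discounts", "reports"},
}

UI_WIDGETS = [
    {"name": "btnNewSale",   "feature": "new_sale"},
    {"name": "btnPayment",   "feature": "payment"},
    {"name": "btnDiscounts", "feature": "discounts"},
    {"name": "btnReports",   "feature": "reports"},
]


def simplify(widgets, role):
    """Keep only the widgets whose feature the given role actually needs."""
    needed = ROLE_FEATURES[role]
    return [w for w in widgets if w["feature"] in needed]
```

In the real system this selection step is followed by the second half of the definition above: re-optimising the layout of the surviving UI elements, which this sketch does not attempt.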