A couple of years ago, Malcolm Gladwell presented what he dubbed the 10,000-hour rule, which states that to become proficient in a skill, one has to invest 10,000 hours into it. Oddly, most tech journalists and bloggers seem to take this as gospel. I'm not going to dissect and argue against Gladwell here; instead, I draw a different conclusion from his argument. I believe that if I invest that much of myself into one particular skill, I'm going to grow bored of it and need to move on.
I have been developing and coding for approximately that long; it's time to move in a different direction.
Oftentimes I have had to explain to clients and colleagues that even though I am good at coding, I do not care for it as much as I care for what the user experiences. Some clients, faced with my technical knowledge, even wanted to push me towards back-end coding, which has nothing to do with visuals or interfaces and is thus of no interest to me.
When I stumbled upon the diagram above, it made so much sense to me! It became so clear that I could then illustrate to people, even those with no technical knowledge, what I do and what I am aiming to do. I need to move towards the left side of that diagram, not the right side!
Don't let them put you into a box
I have never been a fan of simply being told what to do without seeing the big picture. This is why I always strived to be included in every step of a project, from the brief to the strategy, from the design to the production.
However, not everyone is at ease with that. I once had a manager who was displeased because I wanted to take the time to sketch out storyboards; he thought our department should only care about development, and that storyboarding should be left to information architects or designers. I think that was short-sighted: if I knew which assets I required to build the project and knew how to communicate those requirements efficiently, why stay confined to the narrow description of a job position?
Another time, in another place, a newly hired director of production was put in charge of the producers/project managers as well as the technical department. From then on, he chose to limit our input to purely technical matters; we were no longer part of the creative brainstorms. Why would removing the people who build the creative experiences from the creation process be a good idea?
I recently spoke with a friend about expanding or reorienting my service offering, and he suggested that I should choose one specialty and stick with it rather than spread across many. He thought that being technical, creative, strategy- and user-oriented was too many things at once.
A university teacher of mine once said that we should aim to become like the scholars of the Renaissance and try to learn as much as we can from every field. And that is what I want to do, no matter if others feel threatened because they are comfortable with only one specific skill. Too often have we heard stories of specialists, whether developers or factory workers, who are rendered obsolete once a new trend appears.
Updated service offering
With all that in mind, I am reorienting my service offering towards user experience rather than front-end development. Fabricio Teixeira dubbed the person with a multidisciplinary approach to the field the UX chameleon, and I find that image quite apt.
As a user experience consultant, I intend to use my technical knowledge and experience to suggest smart strategies for using technology to appropriately answer the needs of the user, rather than simply pushing the latest tech buzzword. I can create documents that help others get a clear understanding of the goals to reach. I aim to work further upstream in the creation process: analyze the client's brief, then work with the strategy and creative teams.
I have already been drawing sketches, creating user flows and building information architecture for a while; now I am making this side of my skill set my main service offering.
Last spring, Float4 started building an installation for Marine Science Magnet High School. This organisation, located in Connecticut close to the Long Island Sound, centers its educational program not only on academic knowledge but also on natural sciences, mainly marine science. I was stoked, because I have always wanted to participate in more educational projects, whether for schools or museums. Also, since we were targeting a young audience, we could allow ourselves to design interactions without taking users by the hand to show them how everything works.
Even though I have posted screenshots in my portfolio (and I hope to eventually get a video of the kids interacting with it), I thought it would be great to share how we built the project, a case study of sorts.
This installation was split between three states of user interaction:
Attract: At first, catch the user's eye and interest. It's at this point that you invite the user to interact.
Engage: Once the user interacts, add some information with which the user can play a bit.
Explore: The user can explore the experience further from there. Offer more content upon the user's request.
These are general principles for getting a user to interact with an installation, and they are exactly what we applied here. Let's see how we designed this.
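The three states above form a simple loop: a touch moves the installation from attract to engage, a deliberate request for more content moves it to explore, and inactivity eventually returns it to attract. A minimal sketch of that flow (state and event names are mine for illustration, not taken from the actual project code) could look like this:

```javascript
// Minimal state machine for the attract → engage → explore flow.
// State and event names are illustrative, not from the real installation.
const transitions = {
  attract: { touch: "engage" },
  engage:  { explore: "explore", timeout: "attract" },
  explore: { back: "engage", timeout: "attract" },
};

function next(state, event) {
  // Unknown events leave the installation in its current state.
  return (transitions[state] && transitions[state][event]) || state;
}

let state = "attract";
state = next(state, "touch");   // a passer-by touches the screen
state = next(state, "explore"); // they ask for more content
state = next(state, "timeout"); // they walk away; back to attract
```

Keeping the transitions in a single table like this makes it easy to reason about what the installation does when nobody is touching it, which is the state it spends most of its life in.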
Design and user flow
The original concept was heavily influenced by infographic design: heavy fonts, flat and bright colors. To be honest, it was a bit of a mess, with too many elements competing at once for the eye to know where to look and how to read. However, the idea was there; we simply needed to refine it into a finished product.
As can be seen in the concept images above, there is a clear disconnect between the design of the map view and the design of the infographic view. There were also many elements that did not need to be repeated in every state, such as the temperature and other weather information.
I worked with Raed Moussa to streamline the design, keeping only necessary information. He was able to create a cohesive visual environment between the map and the side view.
As for informational elements, we opted to keep them only where they were pertinent:
The attract view contains animated weather information over a map of the Long Island Sound area, with a clear call to action to invite people to touch the screen.
The engage view presents people with some additional information overlaid on the same map that they saw from the attract view. All the information is relative to the map they see: water depth, buoy locations, shipwreck locations and historical events.
The explore view contains a figurative slice view of a coastal region in the same area. Species icons are displayed in locations where said species would usually live, and if the user so desires, more information on each species is available upon touch.
One piece of scientific information we had to overlay on the map was the water depth of the area. This information is usually printed directly on nautical maps or embedded into ships' navigation systems. We wanted to display it the way topographic maps display heights.
The problem was that the available maps did not provide it in that form. Most of the depth variations were not marked visually; they were simply noted as numbers.
So we had an idea: highlight all the numbers with different colors in order to create a heat map. And I mean manually highlight! At first, we ordered the physical maps and planned on sitting at a desk with markers to highlight the numbers. We looked into taking a high-resolution photo of the map, but the depth numbers came out all blurry. It was a no-go.
Finally, we were able to get the digital maps from Nautical Charts Online. From there, I spent a couple of days in Photoshop with my graphic tablet highlighting all the numbers according to a range we had predetermined. We ended up with a good heat map.
We still needed to incorporate this heat map into the design. Raed was provided with a vectorized topographical render of the heat map and incorporated it smoothly into the final map.
Transition from map to slice view
It was a bit tricky to come up with a transition between the map view, a flat map of the Long Island Sound area, and the side view, a slice view of a coastal area. At the time, I was watching the Avatar: The Last Airbender cartoon, and its intro contained exactly what I was looking for! Take a look between 0:12 and 0:28.
This was a great inspiration. Being a fan of old Final Fantasy games, I also thought of how the map was tilted back when the characters would hop onto an airship.
With the storyboard above, Francis Dakin-Côté was able to create the perfect transition we needed between the map view and the slice view. This animation is presented when a user chooses to explore the installation more in depth, providing a logical transition between the states. The same animation is played backwards when going from the slice view to the map view.
As I probably mentioned in earlier posts or tweets, Float4 most often uses RealMotion, its own software. The development was then split in two: the RealMotion graph that handles the interaction and embeds the content, built by Bruno Gohier, and the interface and content, which I built.
Just as Scaleform allows embedding Flash content, so does RealMotion. Since the content for this installation did not need to live anywhere else, and since it was mostly a two-dimensional multitouch graphical interface, Flash was the most appropriate technology to choose here.
As I did for other projects in recent years, I built the application with the RobotLegs framework. I also used AS3 Signals rather than events, which are a charm to work with.
I may have said in other posts that RealMotion can communicate touch points to the application via the TUIO protocol. I was using the TUIO AS3 library, although along the way I ended up finding and fixing some of its memory leaks and other issues. Don't get me wrong though, it's still an awesome library.
The development was pretty much straightforward; however, I don't think I have done something this demanding on a machine since the adidas Originals Women's Lookbook. This project runs in full HD (1920 × 1080) at 60 FPS and had to support multitouch.
The transition animation ended up being a lot more work than expected. Using a 60 FPS video proved impossible: it took too long to start and skipped too many frames. I tried scrubbing the video manually at every frame, which turned out even worse. I also tried dynamically loading all 360 frames of the transition and built a class to display the animation. This played smoothly and was quite satisfying visually. The issue was that once all the images were loaded, the application would freeze while adding them to the stage, and it only worked while developing in FDT; as soon as I tried it in a browser or within the context of RealMotion, it crashed. So I ended up using two 30 FPS videos, one forward and one backward, since the user has to be free to return to the map state.
It was not my first go at multitouch, so I could apply the lessons learned on previous projects. The main one was keeping track of touch IDs: once an item is touched, no matter how many other touches appear on or disappear from the surface, the item is still considered touched as long as the original ID is still on it. It translates into something like this:
override protected function onReleased(event:TuioTouchEvent):void
{
    // Only release the item when the touch that grabbed it lifts off
    if (event.tuioContainer.sessionID == _touchId)
        _touchId = -1;
}
Another touch behaviour that I carried over from the slideshow was the "touch and hold to trigger". You can read a bit more about that logic in a previous post. In this case, it was used in an area where images and videos could be dragged, a bit like a vertical carousel. It then became possible to distinguish between dragging to change images and holding to start a video.
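The idea behind touch-and-hold boils down to two thresholds: if the finger moves more than a few pixels, it's a drag; if it stays put past a time threshold, it's a trigger. Here is a rough sketch of that logic (in JavaScript rather than the original ActionScript; the threshold values and function names are made up for the example, not the project's actual ones):

```javascript
// Distinguish "drag" from "hold to trigger" on a touched item.
// Thresholds and names are illustrative, not the project's real values.
const HOLD_DELAY_MS = 600; // a still touch this long becomes a trigger
const DRAG_SLOP_PX  = 8;   // movement beyond this becomes a drag

function createHoldDetector(now = Date.now) {
  let start = null;
  return {
    down(x, y) {
      start = { x: x, y: y, t: now() };
    },
    // Returns "drag", "trigger" or null (still undecided).
    move(x, y) {
      if (!start) return null;
      const moved = Math.hypot(x - start.x, y - start.y);
      if (moved > DRAG_SLOP_PX) { start = null; return "drag"; }
      if (now() - start.t >= HOLD_DELAY_MS) { start = null; return "trigger"; }
      return null;
    },
  };
}
```

Injecting the clock (`now`) keeps the logic testable; in Flash the same idea maps onto a `Timer` plus the touch-move events.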
Because laser touch detection cannot be as precise as touch screens, we wanted to avoid small UI elements as much as possible. We decided that text fields would be draggable, as on mobile devices, which also allowed us to remove the drag bar and the arrows: more space for content.
The weather data presented in the attract view is pulled from the Yahoo! Weather RSS feed, whereas the buoy data comes from NERACOOS. Actually, if you ever need buoy data, contact these guys; they were thrilled to help us use their service. Even before the project was completed, they showcased the installation in their newsletter.
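For the curious, the Yahoo! Weather RSS feed exposed current conditions as attributes on a `yweather:condition` element, so pulling the temperature out was a small parsing job. A rough sketch (the feed snippet is illustrative, and a regex stands in for proper XML parsing):

```javascript
// Extract the temperature from a Yahoo! Weather RSS snippet.
// The yweather:condition element shown is from the historical feed format;
// a regex is used here as a simplification of real XML parsing.
function parseTemperature(rssXml) {
  const match = rssXml.match(/<yweather:condition[^>]*\btemp="(-?\d+)"/);
  return match ? parseInt(match[1], 10) : null;
}

const sample =
  '<yweather:condition text="Cloudy" code="26" temp="54" date="..."/>';
parseTemperature(sample); // → 54
```

In the installation, the parsed values were then animated over the map in the attract view.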
All other historical and animal data is entered in a CMS by the users (students or teachers).
The RealMotion graph contained the Flash content and presented it on screen. Since this software is really strong with visual effects, there was no need to implement them in the Flash application. We opted for a water ripple effect, which made sense in the context: as the user touches or swipes the surface, water-like waves spread across the screen. In the map view, we also added a mask so that the waves ricochet off the shores of the coast.
In order to manage the content, we provided the client with a CMS (Content Management System). I believe this project can live long and evolve along with the students' work. Rather than providing the client with a paper manual, I used the landing page of the CMS as an instruction page and suggested how they could use the tool:
Do not think of this installation simply as a static repository of information, but rather as a dynamic space to display involving and evolving information concerning the Long Island Sound area. Rather than throwing in all the data at once, maybe there could be a theme for each week or each month.
Here are some ideas:
A student's essay receives a great grade: if a student's essay on a subject that is pertinent to the installation (Long Island Sound area, history or sea life) is well written and worth sharing with the school, present it by adding items in the appropriate sections with excerpts of the essay.
Shipwreck week (or month): showcase only shipwrecks in the Engage View section by deactivating all items that are not shipwrecks.
Shellfish week (or month): showcase only shellfish species in the Explore View section by deactivating all items that are not shellfish.
Obviously, these are simple examples, but it's up to you to engage the students in using the installation, and maybe have them write content for it to be shared with fellow students.
As you can see, I believe that as creators of containers as well as content, we have a responsibility to inspire our clients in how to use the tools we create for them. Sometimes it's obvious (product display for a business), but in this case I thought it best to also guide the teachers and educators. These people already have a lot on their plate; let's make sure what we create answers their needs and doesn't complicate their work.
I built the information architecture for the CMS and handed it to WLAB, the service provider that developed most of the CMS. They used CakePHP, a framework I didn't know at the time. To be honest, I found it too bloated and convoluted for the needs of this project; a fully custom CMS would have been preferable, quicker and more flexible. The provider became overloaded with other projects, so I asked Simon Arame to take over and help us finish the work.
The installation itself consists of two separate screens, each with its own computer. One of these computers acts as the server for the CMS, as mentioned above. Each screen is encased in a PQ Labs frame, which detects touch points. This hardware is handled by RealMotion.
The installation is now in place at the high school. As I said at the top of this post, we will eventually film people using the installation. However, since the school sits in the path of Hurricane Sandy, which recently devastated much of the American east coast, we will give them time to sort out their priorities before pushing to come in and film.
This long project represents a bit how I like to work: touch every bit of the project, do some user experience design and art direction as well as some project management. It gave me a clear view that I'd rather leave hardcore coding aside and put my energy into UX and storyboarding as much as I can.
I'm proud of this project and can't wait to see where the teachers and students of the Marine Science Magnet High School are going to take it!
Even though HTML5 is quite promising, a lot of us have realized that the fragmentation of supported video and audio codecs is quite an issue.
While making a mobile version of Float4's website, I tested on an Android phone and got a message from the embedded Vimeo video that the H.264 codec was not supported. Even though each browser vendor publishes its own matrix of what it does and does not support, that information is not always easy to find quickly. Worse, not all clients are tech-savvy, and they may not be able to tell you what codecs their browser or device supports.
It occurred to me that I should create a simple tool to detect which codecs a browser supports. Inspired by websites like playerversion.com and supportdetails.com, I created codecdetect.com. Each of those websites lets users find out technical information about their computer that they might not know where to look for.
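At its core, this kind of detection relies on the HTML5 `canPlayType()` method, which returns `"probably"`, `"maybe"` or an empty string for a given MIME type and codec string. A simplified sketch (not the actual codecdetect.com source; the probe list is abridged) could look like this:

```javascript
// Probe a media element's canPlayType() for a few common codec strings.
// The probe list is abridged; a real tool would test many more formats.
const VIDEO_PROBES = {
  "H.264":  'video/mp4; codecs="avc1.42E01E"',
  "WebM":   'video/webm; codecs="vp8, vorbis"',
  "Theora": 'video/ogg; codecs="theora"',
};

function detectSupport(canPlayType, probes) {
  const result = {};
  for (const name of Object.keys(probes)) {
    // canPlayType returns "probably", "maybe" or "" (no support).
    result[name] = canPlayType(probes[name]) !== "";
  }
  return result;
}

// In a browser, pass the media element's own method:
// const video = document.createElement("video");
// detectSupport((mime) => video.canPlayType(mime), VIDEO_PROBES);
```

Passing `canPlayType` in as a function keeps the logic independent of the DOM, so the same code can probe a `<video>` or an `<audio>` element.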
EDIT (2012/11/05): I have added an email details form to make users' lives easier (inspired by SupportDetails), and I have also published all the sources in a GitHub repository for everyone to see. Enjoy!