{"id":2125,"date":"2012-04-16T23:43:17","date_gmt":"2012-04-16T23:43:17","guid":{"rendered":"http:\/\/dalelane.co.uk\/blog\/?p=2125"},"modified":"2012-04-17T00:01:43","modified_gmt":"2012-04-17T00:01:43","slug":"has-today-been-a-good-day","status":"publish","type":"post","link":"https:\/\/dalelane.co.uk\/blog\/?p=2125","title":{"rendered":"Has today been a good day?"},"content":{"rendered":"<p>Last week, I <a href=\"http:\/\/dalelane.co.uk\/blog\/?p=2092\">came up with a quick hack<\/a>, explained quite neatly by <a href=\"http:\/\/twitter.com\/crouchingbadger\">@crouchingbadger<\/a>:<\/p>\n<blockquote class=\"twitter-tweet\">\n<p>Dale Lane&#8217;s TV watches him. It knows if he&#8217;s happy or surprised or sad. This is amazing. <a href=\"http:\/\/t.co\/MRsfflPr\" title=\"http:\/\/dalelane.co.uk\/blog\/?p=2092\">dalelane.co.uk\/blog\/?p=2092<\/a> (via @<a href=\"https:\/\/twitter.com\/libbymiller\">libbymiller<\/a>)<\/p>\n<p>&mdash; Ben Ward (@crouchingbadger) <a href=\"https:\/\/twitter.com\/crouchingbadger\/status\/190835758913961984\" data-datetime=\"2012-04-13T16:16:05+00:00\">April 13, 2012<\/a><\/p><\/blockquote>\n<p><script src=\"\/\/platform.twitter.com\/widgets.js\" charset=\"utf-8\"><\/script><\/p>\n<p>It was a bit of fun, even if it did seem to convince a group of commenters on Engadget that <a href=\"http:\/\/www.engadget.com\/2012\/04\/05\/webcam-programmed-to-capture-your-face-while-playing-xbox-gauge\/#disqus_thread\">I was a rage-fuelled Xbox gamer<\/a>. \ud83d\ude42 <\/p>\n<p>There&#8217;s one big limitation with the hack, though: I don&#8217;t spend that much of my day in front of the TV. <\/p>\n<p>It&#8217;s interesting to use it to measure my reactions to specific TV programmes or games. But thinking bigger, it&#8217;d be cool to try a hack that monitors me throughout the day to measure what kind of day I&#8217;m having. <\/p>\n<p>I don&#8217;t spend much time in front of the TV, but I do spend a *lot* of time in front of my MacBook. 
And it has a camera, too! <\/p>\n<p>What if my MacBook could look out for my face, and whenever it can see it, monitor what facial expression I have and whether I&#8217;m smiling? And while I&#8217;m at it, as I&#8217;ve been <a href=\"http:\/\/dalelane.co.uk\/blog\/?p=2113\">playing with sentiment analysis recently<\/a>, add in whether the tweets I post sound positive or neutral. <\/p>\n<p>Add that together, and could I make a reasonable automated estimate as to whether I&#8217;m having a good day? <\/p>\n<p><!--more-->I couldn&#8217;t reuse the same Python to control the webcam that I did for the TV. My best way to control the iSight camera on my MacBook seems to be to use QuickTime Java <sup><a href=\"http:\/\/dalelane.co.uk\/blog\/?p=2125#quicktime\">[1]<\/a><\/sup>. <\/p>\n<p>Here is the capture code:<\/p>\n<pre style=\"border: thin solid silver; background-color: #eeeeee; padding: 0.7em; font-size: 1em; overflow: auto;\">package com.dalelane.happiness;\r\n\r\nimport java.io.File;\r\nimport java.sql.Connection;\r\nimport java.sql.DriverManager;\r\nimport java.sql.PreparedStatement;\r\nimport java.sql.SQLException;\r\n\r\nimport quicktime.QTException;\r\nimport quicktime.QTSession;\r\nimport quicktime.io.QTFile;\r\nimport quicktime.qd.Pict;\r\nimport quicktime.qd.QDGraphics;\r\nimport quicktime.qd.QDRect;\r\nimport quicktime.std.StdQTConstants;\r\nimport quicktime.std.StdQTException;\r\nimport quicktime.std.image.GraphicsExporter;\r\nimport quicktime.std.sg.SGVideoChannel;\r\nimport quicktime.std.sg.SequenceGrabber;\r\n\r\nimport com.github.mhendred.face4j.DefaultFaceClient;\r\nimport com.github.mhendred.face4j.FaceClient;\r\nimport com.github.mhendred.face4j.exception.FaceClientException;\r\nimport com.github.mhendred.face4j.exception.FaceServerException;\r\nimport com.github.mhendred.face4j.model.Face;\r\n\r\n\r\npublic class HappinessMonitor {\r\n    \r\n     \/\/ constants for grabbing a photo\r\n     private final static int PICTURE_WIDTH_PX  = 900;\r\n    
 private final static int PICTURE_HEIGHT_PX = 600;\r\n     private final static String PICTURE_TEMP_FILE_PATH = \"\/tmp\/happinessmonitorcameragrab.jpg\";\r\n    \r\n     \/\/ constants for how often to run\r\n     private final static int POLLING_FREQUENCY_MS = 2000;\r\n    \r\n     \/\/ constants for face.com API\r\n     private final static String FACECOM_API_KEY = \"this-is-my-key-get-your-own\";\r\n     private final static String FACECOM_API_SECRET = \"this-is-my-key-get-your-own\";\r\n     private final static String FACECOM_DALELANE_TAG = \"dalelane@dale.lane\";\r\n    \r\n     \/\/ constants for SQLite used to persist data\r\n     private final static String SQLITE_DB_PATH = \"log.db\";\r\n    \r\n    \r\n     public static void main(String[] args){\r\n          HappinessMonitor monitor = new HappinessMonitor();\r\n          monitor.start();\r\n     }\r\n    \r\n     private SGVideoChannel channel;\r\n     private PreparedStatement insertStatement;\r\n    \r\n     public void start(){\r\n          SequenceGrabber grabber = null;\r\n          Connection dbConnection = null;\r\n         \r\n          try {\r\n               \/\/ prepare camera\r\n               grabber = initialiseCamera();\r\n              \r\n               \/\/ prepare client for face.com REST API\r\n               FaceClient faceClient = new DefaultFaceClient(FACECOM_API_KEY, FACECOM_API_SECRET);\r\n              \r\n               \/\/ prepare database for storing face.com results\r\n               dbConnection = connectToDB();\r\n              \r\n               while (true){\r\n                    \/\/ take a picture with the iSight webcam camera\r\n                    File imagedata = takePicture(grabber);\r\n                   \r\n                    \/\/ upload to face.com\r\n                    if (imagedata != null){\r\n                         Face face = null;\r\n                         try {\r\n                              face = faceClient.recognize(imagedata, 
FACECOM_DALELANE_TAG).getFace();\r\n                         }\r\n                         catch (FaceClientException e) {\r\n                               e.printStackTrace();\r\n                         }\r\n                         catch (FaceServerException e) {\r\n                              e.printStackTrace();\r\n                         }\r\n\r\n                         \/\/ persist response from face.com\r\n                         if (face != null){\r\n                              storeFaceInformation(face);\r\n                         }\r\n                    }\r\n                   \r\n                    \/\/ wait a few seconds before doing this again\r\n                    Thread.sleep(POLLING_FREQUENCY_MS);\r\n               }\r\n          }\r\n          catch (QTException e) {\r\n               e.printStackTrace();\r\n          }\r\n          catch (InterruptedException e) {\r\n               e.printStackTrace();\r\n          }\r\n          catch (SQLException e) {\r\n               e.printStackTrace();\r\n          }\r\n          catch (ClassNotFoundException e) {\r\n               e.printStackTrace();\r\n          }\r\n          finally {\r\n               closeCamera(grabber);\r\n               cleanupTempFiles();\r\n               disconnectFromDB(dbConnection);\r\n          }\r\n     }\r\n    \r\n    \r\n     private SequenceGrabber initialiseCamera() throws QTException {\r\n          \/\/ initialise quicktime java\r\n          QTSession.open();\r\n         \r\n          \/\/ create the image grabber\r\n          SequenceGrabber seqGrabber = new SequenceGrabber();\r\n         \r\n          \/\/ prepare video channel\r\n          QDRect bounds = new QDRect(PICTURE_WIDTH_PX, PICTURE_HEIGHT_PX);\r\n          QDGraphics world = new QDGraphics(bounds);\r\n          seqGrabber.setGWorld(world, null);\r\n          channel = new SGVideoChannel(seqGrabber);\r\n          channel.setBounds(bounds);\r\n         \r\n          \/\/ return grabber\r\n     
     return seqGrabber;\r\n     }\r\n    \r\n     private void closeCamera(SequenceGrabber grabber){\r\n          if (QTSession.isInitialized()){\r\n               if (grabber != null && channel != null){\r\n                    try {\r\n                         grabber.disposeChannel(channel);\r\n                    }\r\n                    catch (StdQTException e) {\r\n                         e.printStackTrace();\r\n                    }\r\n               }\r\n          }\r\n          QTSession.close();\r\n     }\r\n    \r\n     private File takePicture(SequenceGrabber seqGrabber) throws QTException {\r\n          \/\/ prepare channel\r\n          final QDGraphics world = new QDGraphics(channel.getBounds());\r\n          seqGrabber.setGWorld(world, null);\r\n          channel.setBounds(channel.getBounds());\r\n         \r\n          \/\/ grab picture\r\n          seqGrabber.prepare(false, true);\r\n          final Pict picture = seqGrabber.grabPict(channel.getBounds(), 0, 1);\r\n\r\n          \/\/ finished with grabber for the moment\r\n          seqGrabber.idle();\r\n\r\n          \/\/ convert the picture to something we can use\r\n          File jpeg = convertPictToImage(picture);\r\n         \r\n          \/\/ cleanup\r\n          world.disposeQTObject();\r\n         \r\n          return jpeg;\r\n     }\r\n    \r\n     private File convertPictToImage(Pict picture) throws QTException {\r\n          \/\/ use a graphics exporter to convert a quicktime image to a jpg\r\n          GraphicsExporter exporter = new GraphicsExporter(StdQTConstants.kQTFileTypeJPEG);\r\n          exporter.setInputPicture(picture);\r\n          QTFile file = new QTFile(PICTURE_TEMP_FILE_PATH);\r\n          exporter.setOutputFile(file);\r\n          int filesize = exporter.doExport();\r\n          exporter.disposeQTObject();\r\n         \r\n          \/\/ check if it was successful before returning\r\n          File jpeg = null;\r\n          if (filesize > 0){\r\n               jpeg = new 
File(PICTURE_TEMP_FILE_PATH);\r\n          }\r\n          return jpeg;\r\n     }\r\n    \r\n     private void cleanupTempFiles() {\r\n          File imgfile = new File(PICTURE_TEMP_FILE_PATH);\r\n          if (imgfile.exists()){\r\n               imgfile.delete();\r\n          }\r\n     }\r\n    \r\n     private Connection connectToDB() throws SQLException, ClassNotFoundException {         \r\n          \/\/ connect to DB\r\n          Class.forName(\"org.sqlite.JDBC\");\r\n          Connection dbConn = DriverManager.getConnection(\"jdbc:sqlite:\" + SQLITE_DB_PATH);\r\n         \r\n          \/\/ ensure a table exists to store data\r\n          dbConn.createStatement().execute(\"CREATE TABLE IF NOT EXISTS facelog(ts timestamp unique default current_timestamp, isSmiling boolean, smilingConfidence int, mood text, moodConfidence int)\");         \r\n         \r\n          \/\/ create preparedstatement for inserting readings\r\n          insertStatement = dbConn.prepareStatement(\"INSERT INTO facelog(isSmiling, smilingConfidence, mood, moodConfidence) values(?, ?, ?, ?)\");\r\n         \r\n          \/\/ return data\r\n          return dbConn;\r\n     }\r\n    \r\n     private void storeFaceInformation(Face face) throws SQLException {\r\n          \/\/ insert the mood and smiling info from the face.com API response into local DB\r\n          insertStatement.setBoolean(1, face.isSmiling());\r\n          insertStatement.setInt(2, face.getSmilingConfidence());\r\n          insertStatement.setString(3, face.getMood());\r\n          insertStatement.setInt(4, face.getMoodConfidence());\r\n          insertStatement.executeUpdate();\r\n     }\r\n    \r\n     private void disconnectFromDB(Connection conn){\r\n          if (conn != null){\r\n               try {\r\n                    conn.close();\r\n               }\r\n               catch (SQLException e) {\r\n                    e.printStackTrace();\r\n               }\r\n          }\r\n     }\r\n}<\/pre>\n<p>It&#8217;s using 
a different camera API; there are syntax changes in the move from Python to Java; I&#8217;m using a different <a href=\"https:\/\/github.com\/mhendred\/face4j\">client library for face.com<\/a>; and I&#8217;ve dumped the stuff that was storing information about faces other than mine. But other than that, it&#8217;s much the same as <a href=\"http:\/\/dalelane.co.uk\/blog\/?p=2092#more-2092\">the code that I shared last week<\/a>. <\/p>\n<p>It&#8217;s still dumping the results from the face.com API calls into a local SQLite database. <\/p>\n<p>I&#8217;ve only just written this, so it&#8217;s not been running long enough to capture much data. I&#8217;ll run it for a while, then come back and take a look. <\/p>\n<p>I <a href=\"http:\/\/dalelane.co.uk\/blog\/?p=2113\">don&#8217;t seem to be doing well at looking for patterns in personal data recently<\/a>, but I&#8217;m curious to see stuff like:<\/p>\n<ul>\n<li>whether I seem more cheerful at different times of day (I&#8217;m really not a morning person)\n<\/li>\n<li>whether I typically look more cheerful on a Friday than a Monday (should be a safe bet)\n<\/li>\n<li>whether I look more positive when the guy I share an office with is in, compared with when I&#8217;ve got the office to myself\n<\/li>\n<li>and so on<\/li>\n<\/ul>\n<p>If I get a chance, I&#8217;m also curious to see what other data sources I can combine it with. For example, <a href=\"http:\/\/www.last.fm\/user\/dalelane\">my scrobbles from last.fm<\/a>. If I&#8217;m in front of my MacBook, I probably have my earphones in and <a href=\"http:\/\/www.spotify.com\">Spotify<\/a> running. Using the <a href=\"http:\/\/www.last.fm\/api\">last.fm API<\/a>, could I see whether certain types of music affect the mood reflected by my facial expression? <\/p>\n<p>Lots of stuff to try once I&#8217;ve got more data. <\/p>\n<p>In the meantime, here is the data from today. 
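(As an aside: once more readings have built up, turning them into the kind of summary charted below is mostly a counting exercise — tally how often each mood label was observed, and express each tally as a proportion. A minimal sketch of that post-processing step; the MoodSummary class and its proportions method are hypothetical helpers I'm inventing here for illustration, not part of the monitor code above:)

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Hypothetical post-processing sketch: given the mood labels read back
// from the facelog table, work out what proportion of observations fell
// into each expression type.
public class MoodSummary {

    public static Map<String, Double> proportions(List<String> moods) {
        // count occurrences of each label, preserving first-seen order
        Map<String, Double> result = new LinkedHashMap<>();
        for (String mood : moods) {
            result.merge(mood, 1.0, Double::sum);
        }
        // convert each count into a proportion of all observations
        result.replaceAll((mood, count) -> count / moods.size());
        return result;
    }

    public static void main(String[] args) {
        List<String> moods = List.of("neutral", "happy", "neutral", "angry");
        System.out.println(proportions(moods));
        // prints {neutral=0.5, happy=0.25, angry=0.25}
    }
}
```

(The same tallying works for the smiling/not-smiling split, just with two labels instead of the full mood set.)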
<\/p>\n<p>Like before, here&#8217;s an interactive time-series chart showing the mood that my facial expression was classified as. The y-axis shows the level of confidence that the classifier had in its evaluation, which I assume probably has some relationship with how strong the expression was.<br \/>\n<script type=\"text\/javascript\" src=\"http:\/\/www.google.com\/jsapi\"><\/script><script type=\"text\/javascript\" src=\"http:\/\/dalelane.co.uk\/blog\/post-images\/120416-timeserieschart.min.js\"><\/script><script type=\"text\/javascript\">google.load('visualization', '1', {packages: ['annotatedtimeline']});    \ngoogle.setOnLoadCallback(drawInteractiveVisualization);<\/script><\/p>\n<div id=\"120416visualisation\" style=\"width: 450px; height: 310px;\"><\/div>\n<p>And a screenshot for the Flash-deprived:<\/p>\n<p><a href=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-mood-confidences.png\"><img decoding=\"async\" src=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-mood-confidences-thumb.png\" border=\"0\" \/><\/a><\/p>\n<p>&nbsp;<br \/> <br \/>\nAlternatively, we can show the proportion of observations that were classified in each overall expression type:<\/p>\n<p><a href=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-mood-proportion.png\"><img decoding=\"async\" src=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-mood-proportion-thumb.png\" border=\"0\" \/><\/a><\/p>\n<p>Incidentally, given the reactions to my last post, it&#8217;s worth repeating this:<\/p>\n<blockquote><p>I copied the labels from the face.com API. &#8220;angry&#8221; is a pretty broad bucket that covers a range of expressions including frustration, determination, etc.<br \/>\nI wouldn&#8217;t read too much into the pejorative labels. 
I&#8217;m not an angry person, honest \ud83d\ude42<\/p><\/blockquote>\n<p>&nbsp;<br \/>\nHow does this compare to my mood as expressed on <a href=\"http:\/\/twitter.com\/dalelane\">Twitter<\/a>?<\/p>\n<p><a href=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-twitter-sentiment.png\"><img decoding=\"async\" src=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-twitter-sentiment-thumb.png\" border=\"0\" \/><\/a><\/p>\n<p>&nbsp;<br \/>\nAs well as mood, we can look at how many times the code observed me smiling. <\/p>\n<p>Again, we can use the y-axis to show the confidence that the classifier had in identifying whether I was smiling or not. 100% means it was certain that I was smiling, which probably means that I had a big smile. 10% means that it wasn&#8217;t very certain, which perhaps means that I had a very small smile. <\/p>\n<p>I&#8217;m showing smiling as positive values on the y-axis, and not-smiling as negative values.<\/p>\n<p><a href=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-smiling.png\"><img decoding=\"async\" src=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-smiling-thumb.png\" border=\"0\" \/><\/a><\/p>\n<p>&nbsp;<br \/> <br \/>\nDo I not smile very much? \ud83d\ude09<\/p>\n<p>It looks even worse when you show the results as a proportion of time observed smiling versus not-smiling.<\/p>\n<p><a href=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-smiles.png\"><img decoding=\"async\" src=\"http:\/\/i267.photobucket.com\/albums\/ii311\/dale_lane\/120416-smiles-thumb.png\" border=\"0\" \/><\/a><br \/>\n&nbsp;<\/p>\n<p>To conclude&#8230; this was a massively over-engineered, unnecessarily complicated way to answer the question: has today been a good day? <\/p>\n<p>The answer, incidentally, is yes, it was okay, thanks. <\/p>\n<hr \/>\n<p><a name=\"quicktime\">[1]<\/a><br \/>\nYes, I know QuickTime Java has been deprecated. 
But I didn&#8217;t have the time to learn Objective-C just to try a quick hack, and despite being deprecated, QuickTime Java does &#8211; for now &#8211; still work. <\/p>\n","protected":false},"excerpt":{"rendered":"<p>Last week, I came up with a quick hack, explained quite neatly by @crouchingbadger: Dale Lane&#8217;s TV watches him. It knows if he&#8217;s happy or surprised or sad. This is amazing. dalelane.co.uk\/blog\/?p=2092 (via @libbymiller) &mdash; Ben Ward (@crouchingbadger) April 13, 2012 It was a bit of fun, even if it did seem to convince a [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[7],"tags":[522,525,527,251,526],"class_list":["post-2125","post","type-post","status-publish","format-standard","hentry","category-code","tag-face","tag-face-com","tag-isight","tag-lastfm","tag-qtjava"],"_links":{"self":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/2125","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=2125"}],"version-history":[{"count":0,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=\/wp\/v2\/posts\/2125\/revisions"}],"wp:attachment":[{"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=2125"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=2125"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/dalelane.co.uk\/blog\/i
ndex.php?rest_route=%2Fwp%2Fv2%2Ftags&post=2125"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}