deepdream roundup

July 4th, 2015

If you missed deepdream in the news then go and read this article and the original research blog post, and/or look at the original gallery full screen.

Good. Now, given that Google made the software open source, there’s lots more to look at. Check out the #deepdream hashtag on Twitter.

And this Twitch channel lets you “live shout objects to dream about”.

And finally, these two videos are worth a watch:

Journey through the layers of the mind from Memo Akten on Vimeo.

Noisedive from Johan Nordberg on Vimeo.

[Edit]

Also, someone ran it on a clip from Fear and Loathing…

Google’s new Photos app seems pretty great, with a consistent experience between the web and its native Android and iOS versions. The way your photos are organised is better than in Apple’s app, but the clincher is that they give you unlimited online storage if you’re willing to have them compress the originals. Given that (for me) this is just for family snaps, that is fine.
My iCloud storage has been full for weeks, and a combination of not being bothered enough to sort it out and not being sure I want to pay for the service (5GB feels tight, given I recently spent hundreds of pounds on a new iPhone) has led me to leave it like that. So goodbye iCloud Photo Library.

And as it happens you can still post photos to iCloud shared libraries (which are, confusingly, separate from the iCloud Photo Library) direct from the Google Photos app.

Anyway, two days into using it, a couple of things are eluding me:

  1. A lot of people are tweeting about how impressive the facial recognition is, and the feature was demonstrated in the Google I/O keynote, but my Google Photos app (and the web version) has no mention of faces anywhere and no apparent means of manually tagging them – despite my library being full of photos of my family. Perhaps they’re rolling it out incrementally.
  2. Google has rather cleverly tagged and grouped a load of objects and things such as cats, cars, trains and food. However, these collections contain some notable mistakes: a photo of one of my cats sleeping has appeared in the ‘food’ set, for example. Oddly there seems to be no way of untagging these things, even though letting users correct mistakes would presumably help its learning algorithm.

I’m guessing these things will be sorted out in due course, but there’s a chance I’m just missing something obvious. I’ve searched Google and Twitter but can’t find anyone else with the same problem (I mostly care about the face recognition).

Anyone else?

The Web vs native apps

May 21st, 2015

Back in 2010 Sir Tim Berners-Lee warned about the threat posed to the web by Facebook et al.

Yesterday Jeremy Keith made this timely post (thanks to @fjordaan for tweeting it) about how poorly-performing websites are fuelling the shift towards native apps. In case you missed it, Facebook – which has already created a closed content silo – recently launched Instant Articles, which is basically their proprietary presentation mechanism for external content that is (presumably) pre-cached to enhance the speed of the experience.

Rather than taking you to the external site, they keep you on Facebook, which is obviously good for Facebook. But you can’t argue with the fact that the user experience of external news sites is sometimes pretty terrible, so users will understandably like Instant Articles.

I’ll not repeat Jeremy’s points so read his post.

As an aside (from me), Jeremy makes a valid point about the rise of JavaScript frameworks being a contributing factor to the problem. I’ve long argued about the appropriateness or otherwise of single-page-application sites. The truth is that there is a time and a place for them, but they are not necessary for delivering content quickly on the web. People often lose sight of this.

In a previous guise I remember arguing against going full-single-page-app in favour of ‘proper’ indexable content URLs on a project. And for keeping the number of requests on those pages down to a minimum (and, yes, making those requests super speedy via minification, caching et cetera).
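
As a rough sketch of what that looks like in practice – Express, the paths and the cache settings below are purely illustrative stand-ins, not anything from that project – the general shape is to serve pre-minified, fingerprinted assets with far-future cache headers while keeping content on ordinary, indexable URLs:

    // Sketch only: Express, the paths and the cache lifetimes are illustrative.
    import express from "express";

    const app = express();

    // Assets are assumed to be minified at build time and given fingerprinted
    // filenames (e.g. app.3f2a1c.js), so they can be cached for a year and
    // simply renamed whenever they change.
    app.use(
      "/assets",
      express.static("dist/assets", { maxAge: "1y", immutable: true })
    );

    // Content stays on plain, indexable URLs rendered on the server, with a
    // short cache lifetime on the HTML itself.
    const renderArticle = (slug: string) => `<h1>${slug}</h1>`; // stand-in template
    app.get("/articles/:slug", (req, res) => {
      res.set("Cache-Control", "public, max-age=300");
      res.send(renderArticle(req.params.slug));
    });

    app.listen(3000);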

This is all well-understood good practice, and yet a BuzzFeed article I just tested triggered 335 individual server requests. And one of the reasons I don’t particularly like WordPress is that out of the box (and with most of the popular themes) it leads to bloated, request-heavy pages. There’s no culture of optimisation around it, yet WordPress seems more popular than ever (yes, this site is WordPress; it’s good at doing blogs).

This all said, I have spent most of the last 18 months building Milk, a complicated AngularJS-based single-page application. However, the reasons why a JavaScript framework is appropriate for Milk are:

  1. It is only for use by logged-in users.
  2. It serves individual user-specific content such as their personal messages. It’s much faster to load the raw JSON data of a message than to reload an entirely new document with all its assets – see the sketch after this list.
  3. It provides live status updates on some items.
  4. Our caching and local storage strategy ensures that users only load the application framework once, even though they may visit hundreds of pages within the app over the course of a week.
  5. And even then, our uncached page load is only 242KB (on a mobile device) and 18 requests, many of which are asynchronous.
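
The second (and fourth) point boils down to the pattern sketched below. This isn’t Milk’s actual code – Milk is AngularJS, and the endpoint, field names and element ID here are invented – it just shows fetching a small JSON payload and rendering it into the already-loaded application shell rather than requesting a whole new document:

    // Sketch of the "load raw JSON instead of a whole new document" pattern.
    // The /api/messages endpoint, Message shape and #message-pane are invented.
    interface Message {
      id: string;
      from: string;
      body: string;
    }

    async function showMessage(id: string): Promise<void> {
      // A few kilobytes of JSON over the already-open connection...
      const response = await fetch(`/api/messages/${id}`);
      const message: Message = await response.json();

      // ...rendered into the application shell that is already in memory,
      // instead of re-downloading the HTML, CSS and JavaScript for a full page.
      const pane = document.querySelector("#message-pane");
      if (pane) {
        pane.innerHTML = `<h2>${message.from}</h2><p>${message.body}</p>`;
      }
    }

Point 4 is then largely a matter of serving the shell’s own assets with long cache lifetimes (plus, in our case, local storage), so that returning users don’t download the framework again.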

It’s an application, not a website; it just happens to use web technology. This is a very different use-case to a public page of content such as a news article.

The web is natively great at delivering pages of text very quickly. I consider documents and applications quite separately. And I don’t think it’s contradictory to be a cheerleader for both. The trick is, I believe, not to try to make documents more application-like.

Mind you, that ALL said… Although JavaScript frameworks are a problem in some instances, I think the real culprit in the case of the BuzzFeeds of this world is the amount of advertising and sponsored content adding bloat to their pages. If publishers had spent more time testing their sites on EDGE and 3G mobile connections, maybe we’d not be in this situation where Facebook Instant Articles look set to be a hit.

[Edit]
This article on A List Apart also makes some good points.

Bitcasa was – though, in my opinion, no longer is – a very promising cloud data storage provider: a bit like Dropbox, except for two practical differences. Firstly, the Bitcasa desktop application mounts your Bitcasa drive as a network volume rather than syncing to a local folder (so it can hold more data than your hard drive). And secondly, the data is encrypted both in transit and on the server. They also offered “infinite” storage for a very reasonable fee. In principle it was great.

Rachael has been using it (on my advice) to back up her photography work (~80GB of new images per week), and now has several terabytes of TIFF and RAW files in her account. We’ve been running an automated upload process every evening and had a further 10TB to upload. The data is also on RAID hard drive units but, as it’s business-critical information, a remote backup seemed sensible.

Unfortunately on 23rd October Bitcasa announced that they were discontinuing the infinite accounts and were going to be offering a 1TB or a 10TB service for $99 or $999 per annum. For those in the early pricing scheme and with over 1TB of data this amounts to a roughly tenfold increase in annual cost.

“You have between October 22, 2014 and November 15, 2014 to migrate your data”

The other key part of the announcement was that there was a 15th November deadline (just over three weeks) to either migrate the account or download all data, otherwise it would be deleted. That such an unreasonably short amount of time was given reeks, to me, of some corporate/financial “emergency” measure, but that’s just speculation.

Bitcasa has always felt, in my experience, a bit “beta”: uploads are much slower than with Dropbox and are very processor-intensive. This is, I understand, related to the encryption processing, but generally (particularly more recently, running it on a new computer) it’s been usable. We’ve never had much reason to download files from it, though.

Rachael was (grudgingly) willing to upgrade her account to the $999 10TB package in order to buy enough time to find an alternative long-term solution, but it isn’t working. More than 20 attempts to run the account upgrade process have failed with a server error. Several support tickets I raised have not been answered after several days, except one which was marked by them as “Solved” with a generic advice response.

Bitcasa upgrade server error

Awkward indeed… It doesn’t bode well. Maybe they’re just being swamped with user requests but it feels to me like they are going under.

We have therefore been trying to salvage critical data from the account, but the process is slow and unreliable. Despite us having (according to speedtest.net) an 80Mbps download connection, downloading 1GB from Bitcasa is taking about 2–3 hours when dragging the file out of the Bitcasa drive using the Finder on the Mac – at 80Mbps, 1GB should take well under two minutes. And more often than not the operation fails after 40 minutes or so.

Bitcasa - Finder error

The alternative – downloading via their web app – isn’t much better. It’s faster but trying to download more than one file at a time results in a corrupted zip file. Not very practical when you’ve got a folder with hundreds of files in it. Even Bitcasa recommend avoiding it (in a support response):

“We recommend not downloading multiple files through the web portal. If one of the file(s) is damaged, it will break the entire zip file. Downloading single files from the web portal should be fine.”

However, this morning I discovered that moving files in the Terminal is much more reliable; a lot of the problems seem to be related to the Finder. It’s going to take right up to the deadline to get all of the data, but it is now, finally, just about feasible.

On balance, for us, speed and reliability are more important than encryption for this use-case. So we’re moving the data to Amazon ‘Glacier’ (via S3). Uploading directly to S3 is like a dream compared to Bitcasa: the data is uploading at over 2 megabytes per second.
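
For what it’s worth, the upload side really is that simple. The sketch below isn’t our actual script – the bucket name, region and paths are invented, and archiving to Glacier is assumed to be handled by a lifecycle rule on the bucket – but it shows the general shape using the AWS SDK for JavaScript:

    // Sketch only: bucket name, region and paths are invented. Moving objects
    // to Glacier is assumed to be done by a lifecycle rule on the bucket.
    import * as AWS from "aws-sdk";
    import * as fs from "fs";

    const s3 = new AWS.S3({ region: "eu-west-1" });

    async function backUpFile(localPath: string, key: string): Promise<void> {
      await s3
        .upload({
          Bucket: "example-photo-archive",      // hypothetical bucket
          Key: key,                             // e.g. "2014/week-46/IMG_0001.tif"
          Body: fs.createReadStream(localPath), // streamed; big files upload in parts
        })
        .promise();
      console.log(`Uploaded ${localPath} as ${key}`);
    }

    backUpFile("/Volumes/RAID/shoots/IMG_0001.tif", "2014/week-46/IMG_0001.tif")
      .catch((err) => console.error("Upload failed:", err));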

The sad thing is that we were willing to pay $999 to migrate the Bitcasa account, but technical failures and a lack of support made it impossible to do so – and destroyed any confidence we had in the service we would have been paying for.

It looks, on the face of it, like Bitcasa are moving towards becoming a business-to-business, API-driven service provider, but this is basically a big “fuck you” to all their existing customers. If I were one of their investors I would be less than impressed.

The ISS

May 21st, 2014

Live stream from the ISS

At 05.45 on a weekday morning before work in March 2012, I trudged out into the back garden to look up at the sky. It was before dawn and overcast.

I’d signed up for Spot The Station email alerts some months earlier but had since stopped paying attention to them. Then someone I follow on Twitter mentioned that the ISS was due over Cambridge the following day. I don’t live far from Cambridge (not that, given the station’s orbital height is 370km, the exact location is important) so I checked the latest NASA email to find the exact time and where in the sky I should be looking.

The information they provide is incredibly detailed. Here are the forthcoming sighting times for where I live:

iss-sighting-times-PE9

Given the overcast sky it didn’t look promising, but in the minute leading up to the specified time the cloud broke up, so it seemed there was a chance of spotting the station intermittently, depending on its path.

Something like 20 seconds into the scheduled minute of appearance (just late enough for me to concede that I’d probably got the time wrong), the ISS slipped majestically into view and I stood with my neck craned for the roughly three minutes it took to cross the sky to the far horizon. With the naked eye it’s just a point of light, but impressive nonetheless. All remaining cloud broke up ahead of it, so the entire transit was uninterrupted. And as it passed the zenith I goofily and self-consciously waved up at the crew.

30 minutes later I was driving, as I did at the time, to Peterborough railway station to catch the East Coast service to King’s Cross. Radio 4 brought news of renewed clashes in the Gaza Strip. Several teenagers had been killed in an explosion.

Sometimes, usually when tired, I’ll slip into an unusually contemplative state of mind where I become fixated on how some remote observer might view our species. This is influenced, among other things I suppose, by Carl Sagan’s Pale Blue Dot. It can be a bit overwhelming to consider our collective daily goings-on in the un-blinkered context of our little planet hurtling through the bewildering infinity of space. This is by no means a unique or uncommon train of thought, I’m sure…

But this was one of those moments. Watching that speeding point of light that morning had brought a sense of marvel at what we can achieve when intelligent, inquisitive people work together. Projects like the ISS and the LHC at CERN are for me great symbols of hope. And yet at the same time the great Shakespearean tragedy of human conflict continues to play out. The juxtaposition of the two that morning forced me to pull over for a moment to compose myself.

Disappointingly, last week, amid rising tensions between Russia and the US over Ukraine, Russia threatened to pull the plug on the ISS partnership.

New Twitter.com features

April 23rd, 2014

Twitter finally updated my profile to the new display format – several weeks after they upgraded my cat. Here’s an almost pointless blog post about what I like and dislike about the new profile design:

  • Overall appearance: Like
  • Massive font size for just certain tweets apparently selected at random: Dislike
  • Front-end build details, particularly the way the profile photo slides up out of the way as you scroll down, to be replaced by the compact in-nav-bar version: Like
  • Pinned tweets: Dislike (because it reduces the beautiful simplicity of Twitter… but I’ll probably use it to promote something)
  • Not showing replies by default: Like
  • Showing non-tweet-based activity in my timeline such as who I followed: Dislike (I think).

That’s it. You don’t care. Good.

Farewell Windows XP?

March 25th, 2014

woe

Microsoft is ending support for Windows XP and will no longer be selling it. But according to this Independent article XP is still installed on a third of all PCs worldwide. Vista sits at just 4% and around 50% are running Windows 7.

I’m the kind of person who feels a low-level sense of unease if I’ve not installed all of the available updates to whatever software I’m using – past the point of reason, in all honesty. But I’m in the minority here; most normal people just aren’t interested. And they don’t like change.

I was “a PC guy” for many years, having built a few of my own PCs in the late 90s. I was running Windows as my primary OS until February 2005, when I bought a Mac mini out of curiosity and found myself directly in the crosshairs of Apple’s business plan (actually it started when they released iTunes for Windows, which I liked and which led to me buying an iPod).

In the case of Apple, they have an agenda to keep selling new hardware, so although their OS updates are improvements, there is that accidental-on-purpose creep of hardware demands which means a given device gets slower over time. And if you don’t upgrade, third-party software eventually stops supporting your OS. This cycle is hard to avoid in a commercial world where, for example, a designer running an old version of the Adobe suite will eventually start being sent files they cannot open. So they upgrade their OS and soon feel they need to buy a new Mac. It’s no surprise that OS X Mavericks was free.

But in an isolated environment, such as within a corporation, a given computer will in theory run as well today as it did ten years ago, except for failures in hard drives, which are replaceable. A friend told me that his dad is still using an iPhone 3GS running iOS 5 and it’s as fast as the day it was bought. He can’t run many third-party apps, but he can use email and SMS and make and receive phone calls, so why should he upgrade?

Are we early adopters fools for playing the upgrade game? I’d say no, because new software is interesting and useful to us, which is justification enough. As for everyone else, getting the long tail to play catch-up is likely to give Microsoft headaches for years to come.

Other than avoiding the Vista car crash, how could they have played it differently?