After buying a new Apple TV last year I have been trying to use the Plex app
for all my media needs outside of Netflix. I run Plex Media Server on an
old Mac with a very small SSD and lots of external hard drive space, which
holds my media. Because of this I needed to make a few tweaks to get it working.
On OS X, Plex keeps all its metadata in ~/Library/Application Support/Plex
Media Server. I shouldn’t have been surprised when shortly after firing it up
I got a low disk space warning as the downloaded metadata for my library
quickly ate through what little space I had remaining on the boot volume.
Fortunately this was quite straightforward to fix:
1. Shut down Plex Media Server.
2. Move ~/Library/Application Support/Plex Media Server to a different
   location with more available space (in my case /Volumes/Media, which is my
   external drive).
3. From the Terminal, run the following command:
   ln -s "/Volumes/Media/Plex Media Server/" "$HOME/Library/Application Support/Plex Media Server"
4. Relaunch Plex Media Server.
Ensure you substitute /Volumes/Media/Plex Media Server/ with the correct
location you moved it to, while keeping the second path (~/Library...)
exactly the same. These steps replace the original folder with a symbolic
link to the external drive. When Plex attempts to access data in the
directory, OS X will transparently point it at the external drive instead,
without using up any precious SSD space.
All was well and I used this setup for several months, until I tried to play
a 1080p file which needed transcoding. Whenever I selected the file, Plex
would briefly attempt to load before returning to the menus. Looking at the
Plex Server dashboard I could see that there was no transcoding in progress,
but no other errors either. Checking the settings related to transcoding, it
seemed like transcoded files would be put on the external drive, so I didn’t
think that was the issue. I figured the Mac simply wasn’t powerful enough to
transcode the video and thought no more of it.
Until last night! I was bothered by the fact that it simply did nothing. If the
machine wasn’t powerful enough I would have expected buffering problems, and
even manually transcoding in advance wasn’t working. I enabled Debug logging
under Settings -> Server -> General, took a look in ~/Library/Logs/Plex
Media Server.log and immediately spotted the problem:
Mar 11, 2016 18:40:46 [0x70000021d000] WARN - Low disk space: 7351296544 bytes
source file, 31495843840 bytes capacity, 4046225408 bytes available on
The transcoding was happening outside of the Application Support directory,
using another path on the SSD, so there simply wasn’t enough available
space. I used the same steps as above to move the Transcode directory to the
external drive as well, and now transcoding works perfectly.
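Both fixes boil down to the same “move it and leave a symlink behind” recipe, which could be wrapped up in a small helper. This is just a sketch, and relocate is my name for it, not anything Plex provides:

```shell
# Move a directory onto a roomier volume and leave a symlink in its place.
# Quoting matters throughout, since the Plex paths contain spaces.
relocate() {
    src=$1
    dest="$2/$(basename "$src")"
    mv "$src" "$dest" && ln -s "$dest" "$src"
}

# e.g. relocate "$HOME/Library/Application Support/Plex Media Server" /Volumes/Media
```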
When using git I’m a believer in small, meaningful commits – as my colleague
says: “Little commits, pushed often”. In many cases though, this doesn’t
necessarily match how I – or I’m sure others – tend to work. While debugging a
problem you may find yourself fixing other things or making tweaks which are
important but not necessarily related. Personally, I may find myself looking at
a diff, ready to commit, realising that the changes within are part of several
different stories.
I use the term “story” in a narrative, rather than agile sense. Meaningful, well
thought-out commits can tell a tale about the engineering process, sharing
valuable knowledge of decisions or trade-offs as they were made. A well written
commit message is far more valuable than documented code. If the name of a
method or its parameters don’t betray its purpose, then the naming is bad. If
the method body doesn’t itself describe the functionality then it probably needs
refactoring. Documentation goes out of date, but commit messages are always
valid as they are by their very nature tied to the state of the code. They make
git-blame useful for more than just “who broke this”.
git add -p
All of this to say that I’m a big fan of interactive staging in git. Being able
to select and stage individual hunks, or even lines is wonderful and I use it
all the time. It’s also a great way to review every single change that is going
into the repository to ensure it is still necessary - a sort-of personal
code-review. Did you leave an #import in that you’re no longer using? You’re
far more likely to notice it while interactively staging.
Unfortunately not everything can be staged interactively. With binary assets it
becomes an all-or-nothing affair and you end up being greeted by this:
$ git diff en.lproj/Localizable.strings
diff --git a/en.lproj/Localizable.strings b/en.lproj/Localizable.strings
index ff37e30..535a260 100644
Binary files a/en.lproj/Localizable.strings and b/en.lproj/Localizable.strings differ
But wait. Why is Localizable.strings¹, a pure text file, showing as binary?
Apple’s documentation on String Resources explains:
Note: It is recommended that you save strings files using the UTF-16 encoding,
which is the default encoding for standard strings files. It is possible to
create strings files using other property-list formats, including binary
property-list formats and XML formats that use the UTF-8 encoding, but doing
so is not recommended.
The problem is that the core git diff tool doesn’t handle UTF-16 data, so any
tools that depend on it – for instance git add -p – will just give up.
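git decides a file is binary by scanning its first few kilobytes for NUL bytes, and UTF-16 text is full of them: every ASCII character encodes to a pair of bytes, one of which is zero. You can see this from a shell:

```shell
# Dump UTF-16 "Hello" as hex - half of the bytes are 00, which is what
# trips git's binary-file heuristic.
printf 'Hello' | iconv -f utf-8 -t utf-16 | od -An -tx1
```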
Now, sometime in recent history Xcode started supporting UTF-8 encoded
.strings, converting them to UTF-16 at compile time. Depending on your
situation you may be able to convert all of your project .strings files to
UTF-8, commit them and get on with your life.
Unfortunately, although Xcode now supports UTF-8, lots of localisation tooling
including the venerable genstrings still operates on UTF-16. If we use these
tools we have a problem. In my case, the localisation service we use deprecated
their old utility (which allows specifying the encoding), replacing it with
one which only outputs UTF-16.
I discovered that others had encountered the same problem, but the
solution only provided readable diff output. With a .gitattributes
file it is possible to associate certain attributes with files matching a naming
pattern when performing git operations. In this case a diff attribute can name
a shell command which, when you run git diff, is executed over both
the working copy and the repository copy. The output from these commands is
diffed instead of the raw file contents. This can yield many interesting results but
isn’t quite what I’m looking for.
I’d never heard of .gitattributes before so I decided to dig a little deeper
and it wasn’t long before I found something far more promising. Along with
diff there is an attribute named filter. Filter follows a similar idea to
diff except that it provides two commands instead of one – clean and
smudge. Git operations which move content between the working copy and the
repository are piped through these filter commands. clean runs any time
a working copy is going to be committed (when changes are added to the index)
and smudge is used whenever content is being loaded into the working copy
(during a checkout or reset operation).
Filter attributes are perfect for our needs. We can keep the repository copy of
the strings file UTF-8 encoded and the working copy as UTF-16 with a filter
attribute converting between them. Because git filters the working copy before
diffing against the repository, we get the benefits of being able to
incrementally stage files while working with a UTF-16 representation of the
files on disk.
That’s the theory; let’s take a look at how it works in practice.
First the easy bit. Create a .gitattributes file in your repository and add the
following to configure all strings files to be handled with the utf16 filter.
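The line itself would be something along these lines, given the filter name mentioned above (the pattern is all that should need adjusting):

```
*.strings filter=utf16
```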
Next we need to actually define the filter. This is done in the git-config. We
have this configured local to the repository as part of our bootstrap, ensuring
that UTF-16 doesn’t get committed by accident. It would work just as well at the
global level if you work on lots of different projects. The relevant section of
.git/config looks like this.
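Based on the behaviour described below, the section would look something like this – a reconstruction, so treat the exact commands as an approximation:

```
[filter "utf16"]
	clean = iconv -sc -f $(file -b --mime-encoding %f | sed s/be$//) -t utf-8
	smudge = test -f %f && iconv -sc -f utf-8 -t $(file -b --mime-encoding %f) || iconv -sc -f utf-8 -t utf-16
```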
iconv does most of the magic here. If you haven’t come across it before, it’s a
utility to transform text between encodings. Let’s break down the clean command.
The purpose of the clean command is to convert the working copy representation
into a format that will be stored in the repository by cleaning it up for
storage. In this case we want to transform the UTF-16 file on disk into UTF-8
which git’s tools are able to cope with. At its simplest this can be
accomplished with iconv -f utf-16 -t utf-8. This works perfectly if you
already have UTF-16 files in your working copy, but there are several cases
where you might have a UTF-8 file on disk instead (i.e. after a fresh clone). In
this case iconv will blindly read the UTF-8 as UTF-16 and happily present you
with a wall of traditional Chinese characters!
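The garbled output is easy to reproduce – feed iconv some plain ASCII and tell it the input is UTF-16, and each pair of bytes collapses into a single (usually CJK) code point:

```shell
# Six ASCII bytes, misread as three UTF-16LE code units.
printf 'hello!' | iconv -f utf-16le -t utf-8
```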
We always want to convert to UTF-8, but the representation we’re reading
from can vary, so we use file -b --mime-encoding %f to identify the encoding
of the file on disk and use that for the -f parameter instead. %f will be
substituted with the filename by git, -b prevents the filename being
prepended to the output, and --mime-encoding outputs only the encoding, for
instance utf-16be. This is almost what we want – and it will work – but it
will include the BOM in the UTF-8 output. I didn’t really want this, so by
removing the trailing be with sed I end up with utf-16. UTF-8 files always
output utf-8 so there’s no problem there.
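The detection step can be tried in isolation – this is what the command substitution inside clean evaluates, with the sed expression assuming be is the only suffix we need to strip:

```shell
# Pipe what `file -b --mime-encoding` reports through sed to drop a
# trailing "be": utf-16be becomes utf-16, utf-8 passes through untouched.
for detected in utf-16be utf-8; do
    printf '%s\n' "$detected" | sed 's/be$//'
done
# prints utf-16, then utf-8
```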
Finally, -sc ensures that errors are silenced and unrecognised
characters are discarded.
The smudge command is almost the inverse of the clean command, with one
important difference. When the file doesn’t yet exist on disk – possible if
you’re switching between branches or going through history – the file command
fails with output which then breaks iconv. We get around this by first
checking whether the file exists and defaulting to utf-16 for the encoding if
it doesn’t.
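As a standalone sketch, the smudge side might look like this (smudge_strings is a hypothetical name; git substitutes %f with the path and pipes the repository content through the command):

```shell
# Convert repository UTF-8 (stdin) into the working-copy encoding (stdout).
# When the file doesn't exist yet, `file` has nothing to inspect, so we
# fall back to utf-16.
smudge_strings() {
    if [ -f "$1" ]; then
        enc=$(file -b --mime-encoding "$1")
    else
        enc=utf-16
    fi
    iconv -sc -f utf-8 -t "$enc"
}
```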
Setting it up
Although the .gitattributes file exists in the repository, the filter commands
exist in the git configuration which is individual to each machine and copy of
the repository, within .git/config. As the project tooling depends on the
filter behaviour being present I added an additional step to our bootstrap
scripts which adds the filter commands to the local git config.
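The bootstrap step amounts to a couple of git config calls from inside the repository; the exact filter commands here are an assumption based on the behaviour described earlier:

```shell
# Register the utf16 filter in the repo-local config (.git/config).
# A throwaway repository stands in for a real clone here.
cd "$(mktemp -d)" && git init -q
git config filter.utf16.clean 'iconv -sc -f $(file -b --mime-encoding %f | sed s/be$//) -t utf-8'
git config filter.utf16.smudge 'test -f %f && iconv -sc -f utf-8 -t $(file -b --mime-encoding %f) || iconv -sc -f utf-8 -t utf-16'
```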
You could include the filters in your global config to ensure they’re always
available (I haven’t personally done this yet), but given how others would
depend on this behaviour I think it’s best to ensure it will be present in all
copies of the repository.
If you’ve found this post useful or have any suggestions please send me an email
at this domain. Feel free to put anything before the @ - be creative!
¹ Localizable.strings is a file used to store locale-specific strings in
Cocoa apps. ↩
I can be quite an impatient person. Not in the angry sense; rather, I prefer to get useless interludes (such as walking somewhere) out of the way as quickly as possible - as a child I earned the nickname “Running Boy” - even now I find myself running to my car most evenings. Fortunately for my driving licence I enjoy driving and tend to stay around the speed limit.
The modern technological world doesn’t help my impatience. Why would I buy a CD and wait for it to arrive when I can just buy it on iTunes? In fact, why wait for iTunes to download it when I can, more-often-than-not, just search on Spotify and listen immediately.
I began playing the Piano at the age of 7, having weekly lessons with Mrs. Baker. I learnt the theory aspects very quickly but actually playing pieces was more of a challenge. I used to hate practising. I was frustrated that I couldn’t just bypass all the uninteresting pieces and start playing cool stuff. Eventually - to the dismay of an older self - I quit. I kept the piano though and it went unplayed for years. In fact, it wasn’t until it was put in storage while living in rented accommodation a few years ago that I started to miss it.
I started disappearing into the garage to play. I didn’t know any pieces aside from a few short riffs lurking in my muscle memory, I just wanted to play something. Anything. I started finding sheet music I wanted to play, determined that I would struggle through it. When I moved house again I made sure the piano stayed close by. I began to learn some fairly technical pieces. Nothing particularly advanced, and nothing particularly well, but the key thing was that I enjoyed them.
But my impatience was interfering. It didn’t matter which piece I was trying to learn, I would play things at a certain tempo, if not as fast as possible, and with speed comes volume. Some of the pieces I learnt warranted a certain vibrancy. Others, however, sounded wrong; I genuinely struggled to slow them down and play them more softly. Around two years ago I decided I would learn Moonlight Sonata. I often played in bursts, practising for perhaps an hour one day and not returning to it for weeks. I eventually started making progress to the point where I can now play it all the way through. But I still play too fast.
When I was younger and started riding a bike I would cycle everywhere flat-out. I always remember my Dad said I should see how slow I can go - that’s where the real talent lies. This seemed crazy until I tried it. Cycling fast smooths out the process, good balance is much less important. Cycle slowly however and fine, precise balance becomes critical, every movement matters.
I realised this evening how this translates to playing the piano. When playing Moonlight Sonata I have been playing at a fairly fixed tempo, to a learned rhythm. When wanting to stress myself I have played it as fast as I possibly can, with gusto, in some misguided effort to get better. This was completely wrong. I sat down this evening and played it much slower, slower than Beethoven intended. It was so hard. In places where I was trying to play exclusively from memory I got things completely wrong. I played all the way through but I felt like I hardly knew it.
In some desperate effort to reassure myself I started from the beginning and played it at my tempo. I played it even faster; I noticed how many mistakes I was making, but it didn’t matter. At this speed I could carry on without losing my flow because I wasn’t really enjoying the music, I was already thinking about the next slew of notes. By learning it to a rhythm I had masked from my brain the notes I was actually playing, my fingers just played them. You might call it muscle memory. But slowing things down disrupted my rhythm. I discovered places where I need to practise more, where my fingers know the movements but my brain doesn’t know the notes.
The thing is, there is a speed limit to how fast I can play piano. Concert pianists can play much faster but they’re still limited by the electrical impulses in their muscles, the rise and fall of the hammers. But there’s no real limit to how slow you can play, provided you play with the minimal force required to strike the strings. Who says you need to finish the piece before you die? Or even before the heat death of the universe.
I took pause for a moment, then I took it from the top. I played it as slowly as I had attempted previously, focussing on how it sounded, the keys I was playing. I think it was the most beautiful piece of music I have ever played. I could absorb the timbre of each and every note, it was marvellous.
Music stirs emotion. My music choices drive my emotions and vice versa. Playing Moonlight Sonata this evening was the first time music has moved me to tears. There’s an underlying beauty that was lost in my ambition to play it as fast as possible, to get it over with, as if getting through the piece and getting on with something else was some kind of achievement to be proud of.
I began to wonder how many opportunities have passed me by to appreciate things around me. I can think of at least one instance recently where my need to explain things quickly to move onto the next thing meant that I didn’t relax and couldn’t fully appreciate the good company I was in. There is usually more than enough time, there’s no need to be in such a hurry, even if my mind is racing. Especially when my mind is racing. It’s all too easy to miss some opportunities, the little things. The things which matter. Life isn’t an elevator pitch, it doesn’t need everything explained in 90 seconds.
When you take a slower approach to something you tend to notice the details. Things which would get otherwise lost in the furore because they appear and disappear too quickly to worry about.
Sometimes slowing things down is harder than speeding up. It’s in these cases that it’s likely worth the effort, to savour the moment, to relax and appreciate it fully. We should remember to enjoy the journey, because ultimately, what is there at the end?
Today Google announced their augmented reality project, Google Glass: essentially a wearable heads-up display which, according to their concept video, will provide a seamless interface to Google services to improve your day-to-day life.
There were rumblings of such a project back in February, supposedly coming out of “Google[x]”, their far-future-looking lab, and it seems that everything mentioned then has been confirmed, though estimates of late 2012 availability are apparently ambitious.
I think the idea is neat and I look forward to seeing what they manage to produce. However, as many people have pointed out, concept videos rarely depict reality and this one really does seem too good to be true. Even setting aside the system’s interpretation of the protagonist’s “Hrmm” and grunts as commands, many of the context-driven actions, such as informing him the subway was suspended and re-routing to the book store, seem ambitious. Then again, I’m cynical and always seem to underestimate technological capability, so I expect to be at least a little surprised.
On the other hand I am apprehensive to say that I would want to buy one. As far as Google’s services go, I am finding myself increasingly questioning their practices. I saw a brilliant quote on twitter earlier:
Why would the largest advertising company in the world want to place a screen between my eyeballs and reality?
Assuming Apple were to release something similar, I’m sure all the same walled-garden arguments would apply that people use to compare iOS and Android. People (read: nerds) will want their AR displays to be open so they can hack on them. When Steve first unveiled the iPhone SDK, he said that they were being very restrictive about what was possible with the SDK because the phone needs to be reliable. People don’t want malicious or badly written software crashing their phone.
I think a wearable HUD takes this to a different level. It’s not a life-dependent system, granted, but if I’m walking along and suddenly get bombarded by some form of audio/visual distraction directly in my field of view as I’m crossing a street that could put my life in danger. Perhaps this is a hyperbolic example, but on a device which, as I gather, is intended to be active the majority of the time - as opposed to a phone which will spend the majority of the time in a pocket - such possibilities are more likely to present themselves.
I’m excited about the technological potential, particularly how they are able to project an in-focus image onto the user’s retina. I have some ideas about how they might do this but I’ll be interested to see what they’ve done. I’m looking forward to trying the system out; one of my colleagues has already expressed interest in getting one to experiment with in the lab at work.
However if I were to buy one for my own use I’d rather pay appropriately for the experience from a company like Apple or Microsoft than have it funded by advertising.
Plus, do I really want to look like something out of Minority Report? Perhaps. I’ll just have to wait and see.
I mentioned in a previous post about the launch of the new iPad that I had decided my money was safe. At university I became very strict with myself about leaving my laptop on my desk, connected to an external KVM, and as such I used the iPad extensively elsewhere. When I graduated last year, I moved back to my parents’ house so I could save more effectively for a house deposit. As a result I found myself using my laptop as a portable machine again and my iPad fell by the wayside.
In the week prior to the iPad announcement - shortly after I wrote the Resolutionary post - I had left mine in the living room and my dad picked it up and started using it. Before long he was looking at second-hand prices on eBay; it was clear he enjoyed using it much more than his personal laptop.
When my friend Mark informed me that he would attempt to purchase an iPad after work on launch day, I decided I would go along for the ride. I was curious to see what the display was like and I got the impression he might need a little encouragement to part with his £500. It was exactly what I was expecting - certainly a very impressive screen - but I was a little underwhelmed as I couldn’t really appreciate the improvements prodding a display model in a busy store. I still didn’t feel I could justify the cost to myself as I already had a perfectly good iPad.
Mark decided he was going to take the plunge, so we approached one of the store staff to ask the all-important question - did they have the right model in stock? After all, online orders were showing a 2-3 week lead time across the board. Surprisingly they had plenty, and we were promptly escorted out of the store into the queue system which is always present for launches at our local store. While there was no one in the queue, that was where the staff were standing with stacks of cards, each one representing a unit of stock. I asked about the abundance of stock; while they wouldn’t comment on how many they had received, they did point out that what they had would have to last the weekend and they were expecting to be busy.
We were allocated a card representing a 32GB black WiFi only iPad, then taken back into the store to complete the transaction. It was clear that while the store was very busy there weren’t many customers as Mark paid almost immediately. I was very impressed overall, they seem to be getting on top of their launch supply. I can’t see myself queueing overnight for a launch again.
That evening I read lots of feedback on twitter from people getting their new toys and one thing jumped out at me. Someone mentioned that the screen brightness on new iPads could be set much dimmer than previous models and this got me thinking. I have a Kindle which I read on, primarily because I don’t enjoy having my retinas scorched by the iPad screen at night. Having the option of reading on the iPad at night was very appealing though as it is much more versatile.
The following day I visited Mark and spent roughly 2-3 minutes with his iPad, primarily checking exactly how dim the screen could go. I moved some money around, called the Apple Store to check their stock levels and an hour later I was in possession of a black 32GB WiFi+4G iPad along with a dark grey Smart Cover.
The Smart Cover, despite its flaws, lives up to my expectations and provides a happy medium between a bulky case and a naked, easily damaged iPad. I’m ashamed to admit my previous iPad sustained more damage than I would have liked. My dad bought my old iPad off me for the price of my car’s Vehicle Excise Duty, which was handy because I’d spent all my money on an iPad.
I find myself using the iPad for much more than I did previously. I happen to be writing this on my Mac because I was working on some code when I decided to write but am I happy overall? Absolutely.
The thing I’m most pleased about is the name. Pundits reached fever pitch this week over what it was going to be called. With the media coverage of the “disappointing” iPhone 4S, complaining that it wasn’t the “iPhone 5” that everyone was expecting, I’m glad Apple seem to have abandoned the idea of incremental naming of their products.
I think Apple, having realised they were on a slippery slope, are beginning to remedy their mistake. Every other product (except the iPhone) has survived with a simple name: iPod Touch, MacBook Air. Each product has an identifier such as “4th Generation” or “Late 2009” but they aren’t used in marketing, and that is the key. People don’t want an “iPad 2S 64GB Wifi+4G”. They want an iPad. Everything else is secondary.
Tim Cook is clearly honoring one of Steve Jobs's final wishes: "Fuck with the pundits' ability to guess what our new devices will be called"
If I were in their position that would probably be my philosophy. They’re still going to sell them faster than they can build them. Regardless of its name.
Nevertheless, my money is safe for now. I don’t feel compelled to upgrade and I don’t think I use my current iPad enough anyway. I shall instead save my money to upgrade to the new iPhone. Or I could just go back to saving for a house as I should be doing.