Dirac, a GPL'ed wavelet-based video compression algorithm, reached the 1.0 milestone. I'm not expecting an Apple-official QuickTime component (there is a non-Apple project to create a QT component for Dirac, encoder included), but I'm interested in comparisons with H.264. Anyone?
I don't contribute code to the git project, so the least I can do is use the master version daily. As I explained previously, I have an automatic process to do that. To keep my git up-to-date, I use my x-git-update-to-latest-version script. This script updates my local clone of the git repo (kept locally at ~/work/track/git), then configures and installs it (at /usr/local/git-`git describe`) and updates the /usr/local/git symlink. This way, I can have /usr/local/git/bin in my PATH and I'm always using the latest version.
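The gist of the script looks something like this. This is a reconstruction from the description above, not the original script, so treat the exact configure and make invocations as assumptions:

```shell
#!/bin/sh
# x-git-update-to-latest-version (sketch): pull the latest git master,
# install it under a versioned prefix, and repoint the stable symlink.
set -e

cd ~/work/track/git
git pull

# Name the install prefix after the version we are about to build.
prefix=/usr/local/git-$(git describe)

make configure
./configure --prefix="$prefix"
make all
sudo make install

# Atomically repoint /usr/local/git at the fresh install.
sudo ln -nsf "$prefix" /usr/local/git
```

Because each build lands in its own /usr/local/git-&lt;version&gt; directory, rolling back is just repointing the symlink.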
The first weekend of September, I went to Barcamp in Coimbra. I was only there for the first day, but I got to know and talk to a lot of people I usually only read online, which was pretty cool. I gave a presentation entitled "WTF is XMPP?", mostly centered on non-instant-messaging applications of XMPP. I think it went well: the room was pretty full, and after the event I talked a lot with people who plan to use XMPP in the short term.
For quite some years now, I have been using compile.sh scripts to set up my baseline system for each project. If you happen to work in Portugal at one of my previous employers, you might find them somewhere in /servers. (Historical note: the /servers nomenclature was devised between '95 and '97, either at Telenet or IP Global. It was later refined at other companies to include /servers/etc, /servers/logs, /servers/data, and /servers/workspace for specific purposes.)
Jack Moffitt was bitten by auto-save. My auto-save setup is "save when TextMate loses focus", but yesterday I was scripting something better that would make a great auto-save post-script. When I start working on something experimental, I would like to have a snapshot of every path I take and then undo. Sometimes I write some code, say "naahhh, won't work", and undo it, without any record. And that is bad, because some of those paths actually turn out to be good ones after all.
Interesting read to catch up on current PC architecture. Favorite quote: One developer we consulted about the issue noted, "Consumers are being scammed by [PC] OEMs on a large scale. OEMs will encourage customers to upgrade a 2GB machine to 4GB, even though the usable RAM might be limited to 2.3GB. This is especially a problem on high-end gaming machines that have huge graphics cards as well as lots of RAM."
I'm refactoring an old site where the art of source control went out the window somewhere in the past. The current problem I'm trying to solve is multiple versions of the lib/ directory, each with its own copies of the same .pm files, some of them with local modifications. As a first step, I want to create a single central lib/ holding the files that are identical across all the other directories.
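That first step can be sketched in shell. Here `common_pm` lists every .pm path (relative to the first directory) whose content is byte-identical in all the given directories, which are exactly the files safe to hoist into the central lib/. The directory names in the usage line are hypothetical:

```shell
# common_pm DIR1 DIR2 ... : print .pm files identical in every DIR.
common_pm() {
  first=$1
  (cd "$first" && find . -type f -name '*.pm') | while read -r f; do
    ok=1
    for d in "$@"; do
      # A missing or byte-different copy disqualifies the file.
      if ! [ -f "$d/$f" ] || ! cmp -s "$first/$f" "$d/$f"; then
        ok=0
        break
      fi
    done
    if [ "$ok" -eq 1 ]; then echo "${f#./}"; fi
  done
}
```

Usage would be something like `common_pm site-a/lib site-b/lib site-c/lib > safe-to-merge.txt`; whatever does not show up in the output is a locally modified copy that needs a real diff and a human decision.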
With a 38-page comic, you can get to know a bit about Google Chrome, the Google browser. Highlights: it uses WebKit as the renderer; it has its own JS engine, written by the V8 team, and it includes a JIT; each tab is a separate "process" running inside a jail or sandbox; Gears is built in; it lets you run a Web-based app in a chrome-less window; the project is completely open source (I think they mean source-available, but maybe I'm a pessimist regarding Google's openness).
If you strive for a stress-free life while keeping programming at the same time, I assume you know that automated testing and test-driven development are essential tools. I've been using them for most (not all) of what I do for the last year or so. Basic stuff: Test::More and friends, more recently Test::Most, plus the basic prove tool and Devel::Cover for extra peace of mind.
BitTorrent RAID, cool. He should write academic papers, though.