Engine Installation

[Photo: the car body lifted off for the engine install]

So the recommended way to put this engine in, according to knowledgeable people and GM’s instructions, is to take the car body off. That method is a lot of work, but it does make for a cool photo.

Engine Build Progress 2

[Photos: engine assembly progress, May 30]

It’s really starting to look like an engine

[Photo: engine assembly progress, June 1]

Engine Build Progress

I took some pictures while putting pistons in my engine block today.

[Photos: engine on the stand, crank installed, empty cylinder, piston ring, full ring set, end cap, upper bearing, ring compressor, piston inserted]

W500 and the ATI FireGL V5700 on XP64

There is a way to get the ATI FireGL V5700 drivers onto Windows XP 64 bit on the Lenovo W500. It wasn’t easy…

Lenovo doesn’t package XP64 drivers for this chip, as far as I could tell. But HP does. The problem is that they won’t install unless you alter some files.

Find the ATI video drivers for HP’s EliteBook 8530w and try to install them. The install will fail, complaining about system requirements.

Go to c:\SwSetup\SP44851\Driver and edit all the INF files you can find there (I think including the ones in the XP64A_INF directory). Change every occurrence of the string 3604103C to 212717AA. Then run Setup.exe and the driver install takes.
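Any editor’s find-and-replace works for this, but if you happen to have Unix-style tools on the machine (Cygwin or GnuWin32, say; plain XP64 ships neither, so this is an assumption about your setup), sed can hit all the files in one shot:

cd /cygdrive/c/SwSetup/SP44851/Driver
# swap HP's subsystem ID string (vendor 103C) for Lenovo's (vendor 17AA) in every INF
sed -i 's/3604103C/212717AA/g' *.inf XP64A_INF/*.inf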

I found that after a restart, Windows reset the driver to Microsoft’s default VGA driver. Go to Device Manager and do a rollback to the ATI driver, and it sticks.

Used Videogames And Why Publishers Make Money

I posted this response on Slashdot to argue against the following reader comment:

…I understand that publishers don’t make any money off used games sales…I get that.

Publishers do make money off used game sales. Not directly, but it’s easy to see if you analyze the system.

Person A buys a game new (say $50), plays it, and sells it to a used game broker, let’s say GameStop (for $20).
Person B buys the used game from GameStop (for $40). Part of this purchase goes to the broker for facilitating the transaction, and part goes to subsidize the original purchase price (the $20 Person A received when selling the game comes from this purchase).

So Person A effectively purchased the game for less money. The lower price for Person A either allows him to purchase the game in the first place (was his perceived utility of the game between $30 and $50?), or leaves leftover money for the purchase of another game (this is his hobby, so more money may end up with game publishers).

So through the secondary market, Persons A and B share the cost. If, as your hypothetical publisher who doesn’t “make any money off used game sales” argues, Persons A and B would both have bought the game for $50 each, giving the publisher $100 in revenue, then the game could have been priced closer to that $100, knowing the secondary market would allow for the cost sharing (let’s say an MSRP of $80, giving the broker a $20 piece of the $100 pie). And if it wouldn’t have sold at $80 to $100, then A and B weren’t interested enough to each pay $50, were they?
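To make the arithmetic explicit with the numbers above:

Person A pays $50 new and gets $20 back, for a net cost of $30.
Person B pays $40 used, for a net cost of $40.
The publisher collects $50 (the new sale); GameStop collects $40 and pays out $20, keeping $20.
The two players spend $70 between them, which is exactly the $50 + $20 the publisher and broker take in.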

To put cost sharing another way, my brothers and I would buy a bunch of video games when we were young. The money came from allowance and mowing lawns. To get a $50 game we’d all throw in money, and we’d all play the game. If we each had to pay $50, we’d have bought a lot fewer games, because there wasn’t enough allowance and lawn mowing to get that kind of cash, and some games just weren’t worth that much. So are the game studio and publisher losing money? Or are they making even more money? Does it just change the way the industry must operate and market its product?

Here’s the fun question: If cost sharing and a used market didn’t exist, what would the MSRP of a game be? I’d wager less than it is today.

MythTV Install and Export to iPhone

My MythTV backend is now working the way I’d like it to. I bought an HDHomeRun off Newegg for its two tuners, which both handle ATSC and QAM, and I like that it sits on the network. I have an Ubuntu machine running Karmic that I wanted to put the Myth backend on.

The most awesome thing is that there’s a MythTV package in apt, so the install was simple. Since I couldn’t remember the MySQL root password (because how often do you add databases or tables?), I had to override the password, set it to something I knew, and then clear and reinstall the MythTV database package. With that done, setup was pretty straightforward: finding the tuners, scanning channels, and adding a schedulesdirect.com account ($20 per year for listings, since zap2it won’t do that for free anymore).
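If you’re stuck at the same point, here’s a minimal sketch of that password override on Karmic-era MySQL 5.x (the password, init-script invocations, and package name are my assumptions for that release, so adjust to your system):

# stop the normal server and start one that skips the grant tables
sudo /etc/init.d/mysql stop
sudo mysqld_safe --skip-grant-tables &
# set the root password to something you know
mysql -u root -e "UPDATE mysql.user SET Password=PASSWORD('newpass') WHERE User='root'; FLUSH PRIVILEGES;"
# shut down the temporary server (the new password is live after the flush) and restart normally
mysqladmin -u root -pnewpass shutdown
sudo /etc/init.d/mysql start
# then clear out and re-run the MythTV database setup
sudo apt-get remove --purge mythtv-database
sudo apt-get install mythtv-database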

The next part was exporting commercial-free recordings into iPhone format. There is a package (again through apt) called mythexport that claims to do this. It handles jobs started from the Myth frontend pretty well, but it required a lot of tweaking. Using a web browser, go to localhost/mythexport and set up some initial settings for what you’re trying to do. Any jobs you create will probably fail at this point, and you’ll have to make some changes.

First, if you’re running Karmic, add the Medibuntu repository to apt and update the codecs to regain AAC audio (apparently this wasn’t a problem before). The command that re-encodes the video lives in /etc/mythtv/mythexport/mythexport_settings.cfg; it’s the long line, you can’t miss it. Change the audio codec from the MP3 library to libfaac. Then try exporting something. /var/log/mythtv/mythexport.log won’t show you the exact errors, but if the job fails it will give you the command to run to try again (it’s the command starting with “nice”). Copy it out and run it yourself with different arguments until it works.

My settings are currently as follows (the “-ac 2 -ar 48000” part was important; I had to add it):

ffmpegArgs=-y -acodec libfaac -ab 128kb -vcodec mpeg4 -b 600kb -mbd 2
-flags +4mv+aic -trellis 2 -cmp 2 -subcmp 2 -s 480x320 -aspect 16:9
-ac 2 -ar 48000
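If you want to sanity-check those arguments outside of mythexport, you can hand them straight to ffmpeg on a recording (the file names here are made up; the recordings directory is the MythTV default):

ffmpeg -y -i /var/lib/mythtv/recordings/some_show.mpg -acodec libfaac -ab 128kb \
  -vcodec mpeg4 -b 600kb -mbd 2 -flags +4mv+aic -trellis 2 -cmp 2 -subcmp 2 \
  -s 480x320 -aspect 16:9 -ac 2 -ar 48000 test.mp4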

And now it works!

Also note that mythexport adds an Apache2 directory to your configuration. The Apache install I got from apt was configured as a public-facing webserver, so I had to lock down those directories with .htaccess files (iTunes will ask for the password when downloading videos from the podcast/RSS feed, so BasicAuth isn’t too limiting).
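The .htaccess files themselves are short. A minimal sketch (the realm name and password-file path are just examples, and the directory needs AllowOverride AuthConfig in the site config for Apache to honor it):

AuthType Basic
AuthName "mythexport"
AuthUserFile /etc/apache2/htpasswd.mythexport
Require valid-user

Create the password file with htpasswd -c /etc/apache2/htpasswd.mythexport someuser, and iTunes will prompt for those credentials when it fetches the feed’s videos.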

Prefix commands with pseudo

So, walking to work this morning, I figured out what the name “otherroute” is about. It’s about going a different way than normal, to be sure, but now there’s more.

The “route” is pronounced like “root”, as in the user root. And since I’m already using homophones (some might use the more derogatory word: puns), I figure the way you act as “otherroute” is to use the “pseudo” command (see “sudo”). Fun, no? It made me laugh.
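And if you want the bit to actually work at a prompt, one line in your shell config is enough:

alias pseudo=sudo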

Back Online!

I moved recently and had all my computers off for quite a while. Now they’re back, and I intend to write about my new MythTV setup. I’ve got the backend running, and need to get a frontend on an AppleTV for my TV screen.

WordPress Online

EDIT 8/10/09: I’ve been told on the ale.org list that Ubuntu’s wordpress package ships an older version of WordPress and hasn’t been updated recently. So for now I recommend NOT using apt to manage your WordPress install.


otherroute.net is back online.  And after doing some work getting a lot of different settings right, I realized that it could have been much easier.

sudo apt-get install wordpress

Yup, that’s all it takes.

My last webserver was an Xbox that ran Debian.  To set that up, I found all the source files I needed, configured and patched things, compiled and installed.  It was a familiar exercise of trying to compile only to find what’s missing, downloading more compressed tar files, and trying to compile everything again.  An afternoon or two later and I had a moderately working Apache webserver to host my personal website, projects for school, and other things that wanted to run on Linux.  And then I learned about apt, which makes life easier by doing all those things for you.  A single command and you’ve got Apache running.

I wanted to avoid that unnecessary work this time around when getting WordPress running on my new dedicated Ubuntu machine. I was able to make good use of apt. It found Apache2 for me and installed it. It found PHP and MySQL and installed them too. And then I went off configuring things. I made a database user and databases. I worked on the /var/www folder tree where I had put WordPress, setting permissions so that Apache could do everything it needed but nothing more. I turned on Apache’s mod_rewrite when the usual WordPress URL formatting wouldn’t work, and then turned on an Apache permission for the site when mod_rewrite still wasn’t working. Before calling everything complete, I wrote a script that would download WordPress’ latest.tar.gz and update the site. And then I found that apt could have done all of that, with updates integrated into my usual system update process.
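For the record, the mod_rewrite fix came down to something like this (a sketch; I’m assuming the stock default site file, and that the permission in question was AllowOverride, which controls whether WordPress’ .htaccess rewrite rules get read at all):

sudo a2enmod rewrite
# in /etc/apache2/sites-available/default, change "AllowOverride None"
# to "AllowOverride All" in the /var/www directory block
sudo /etc/init.d/apache2 restart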

I had again wasted a couple of afternoons configuring things that I could have just had the system do for me. I hadn’t thought about a different way to do things at the time. I knew how to get the job done, so I started working the way I knew how. Almost exactly like last time, after I was done I found that someone had provided a much easier way to get things started and, in the future, keep them up to date. This time around I thought I knew the tools available to me, but my knowledge was dated and the end result was the same as last time.

But this exercise taught me a new strategy that I hadn’t learned before. Before doing things I know how to do, I need to reevaluate whether my known way is still the best way. Things change too quickly, and assumptions from a few years ago shouldn’t be applied to technology today. Sure, knowing how to do it the hard way by hand, like configuring WordPress or compiling Apache, helps in understanding what’s going on, but there are better, popular, well-known ways to get it done. And check apt for everything, even webapps.