A terabyte HD is now available for just $170.
Many many years ago, there was a BBS called "TeraByte". It had a 1 GB HD, which was far bigger than anything I had seen. At that time, a mid-range PC had a 120 MB HD.
A look at some trends. Data is from marketshare.hitslink.com.
| OS | Feb 2007 | Feb 2008 | Feb 2009 |
| Browser | Feb 2007 | Feb 2008 | Feb 2009 |
I have two monitors that are 1600 x 1200. I find it harder and harder to move back to my notebook with 1024 x 768.
More apps now try to squeeze more things onto the screen. The content area is very small as a result.
It's hard to view another person's large desktop when they share it over the network. Good thing the sharing app allows the view to be resized.
I reinstalled two of my PCs recently. I installed Windows XP on a Xeon 2.8 GHz PC with 2.5 GB RAM. It is meant to be my "production" server. In hindsight, I should have installed Vista or Windows 7 on it. There's a need to test new OS features from time to time.
I also reinstalled XP on an older P4 2.4 GHz PC with 512 MB RAM. It was running Windows 2000 previously. One factor in choosing XP over Windows 2000 is that XP has 70% marketshare while Windows 2000 only has around 2%, so there is no longer any reason to keep Windows 2000 around for testing.
I installed IE 7 on the Xeon PC and then found that IE 6 doesn't co-exist with it! (The previous MultipleIEs solution no longer works.) IE 7 has 48% marketshare and IE 6 still has 18% marketshare, so it is still essential to test IE 6. I'm keeping IE 6 on the P4 PC.
I will run in a limited user account most of the time, so that should limit any hacking through browser loopholes.
My tablet PC's stylus stopped working, adding to its long list of problems.
The service centre quoted me S$129 for it, and that's excluding GST. And I thought the most it could cost was $50!
I tried to look for it in SLS, but I was disappointed. There were some other styluses, but not the exact one. I found a second-hand TC4200 stylus at $45, but I decided not to buy it. I want the exact one because it can be "parked" in its garage.
I then went online to look for it. It was readily available on eBay. It was around US$45, and around US$10 to US$15 for shipping.
I decided to get it from the service centre. It's a three-to-four-week wait. In the meantime, I'll use a mouse.
When I return my tablet PC (it's company property), I'll keep the stylus as a memento. :-)
There are five mainstream browsers now: Internet Explorer, Firefox, Safari, Opera and Chrome.
Because of this, I prefer to use Java applets. Write once, run on every browser. However, this requires the user to have Java installed.
I hate to reinvent the wheel, especially for production code. I observed how Microsoft developed its applications. It always evolves its applications and almost never makes an outright change. I've seen how its applications (and OS) started simple and evolved to become everything-with-the-kitchen-sink over several versions. Almost nothing is ever thrown away. (You can tell because a feature or workflow has sort of a theme/signature to it.)
If the legacy code works, keep it, no matter how ugly it is. That's my philosophy. New code is always good, but it will inevitably be full of bugs and requires many rounds of real-world testing to make it production ready.
I don't like to modify legacy code to add features, though. What I propose is an upgrade path. Have the legacy code handle all existing features, and the new code handle new ones. Then slowly migrate the old features over one by one.
I've seen too many rewrites due to new computer languages (non-OO to OO), changes in data format (non-XML to XML), and changes in presentation (table to div, non-CSS to CSS). Most are unnecessary for existing features, IMO.
While most children his age sketch on paper with crayons, nine-year-old Lim Ding Wen from Singapore has a very different canvas -- his iPhone.
Lim, who is in fourth grade, writes applications for Apple's popular iPhone. His latest, a painting program called Doodle Kids, has been downloaded over 4,000 times from Apple's iTunes store in two weeks, the New Paper reported on Thursday.
The program lets iPhone users draw with their fingers by touching the iPhone's touchscreen and then clear the screen by shaking the phone.
"I wrote the program for my younger sisters, who like to draw," Lim said. His sisters are aged 3 and 5.
Lim, who is fluent in six programming languages, started using the computer at the age of 2. He has since completed about 20 programming projects. His father, Lim Thye Chean, a chief technology officer at a local technology firm, also writes iPhone applications.
"Every evening we check the statistics emailed to us (by iTunes) to see who has more downloads," the older Lim said.
The boy, who enjoys reading books on programming, is in the process of writing another iPhone application -- a science fiction game called "Invader Wars".
A 9-year-old with an iPhone? Fluent in six programming languages. Fluent? Six? Not bad if it's true.
The highest capacity SD card today is 32 GB, with 64 GB on the way.
Incredible. I still have a 32 MB SD card with me, from several years ago.
32 GB is enough for most office-type work and surfing the net. Without my pictures, music and videos, I doubt I can use even 15 GB (including the OS).
An SD card is just 32mm x 21mm x 2.1mm. A 1.8" form factor SSD is much larger in comparison: 71mm x 54mm x 8mm. (Of course, the SSD is much faster.)
It doesn't take much to see that in the future, SSDs will have a much smaller form factor, say 50mm x 20mm x 5mm. SSDs do not need a circular disc like traditional HDs.
The Intel processors, starting with the i386, use a 4 kB page size. Starting with the Pentium Pro, they support PAE (Physical Address Extension), whose large pages are 2 MB.
IMO, 4 kB is too small and 2 MB is too large.
In the days of the i386, a high-end server may have 128 MB RAM. Today, it is no longer a dream to have 4 GB RAM.
128 MB = 32,768 pages. 4 GB = 1,048,576 pages!
To get back to 32k pages, the pages should be 128 kB. Still, my gut feeling says that this is too large. I would opt for a page size of 16 to 32 kB.
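The arithmetic above is easy to check with a few lines of Python, using the sizes quoted in this entry:

```python
# Page-count arithmetic for various RAM and page sizes.
KB = 1024
MB = 1024 * KB
GB = 1024 * MB

def page_count(ram_bytes, page_bytes):
    """How many pages are needed to cover ram_bytes."""
    return ram_bytes // page_bytes

# An i386-era 128 MB server with 4 kB pages:
print(page_count(128 * MB, 4 * KB))    # 32768

# A modern 4 GB machine with the same 4 kB pages:
print(page_count(4 * GB, 4 * KB))      # 1048576

# Getting back to 32k pages at 4 GB requires 128 kB pages:
print(page_count(4 * GB, 128 * KB))    # 32768

# The suggested compromise of 16 kB and 32 kB pages:
print(page_count(4 * GB, 16 * KB))     # 262144
print(page_count(4 * GB, 32 * KB))     # 131072
```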
In this day and age, why do we still have to load programs into RAM before we run them?
For a 32-bit OS, there should be a dedicated 4 GB data file on the HD that is the "working copy". The RAM acts as the L3 cache. Whenever a (4 kB) page is accessed, it is paged in as necessary.
An app is "pre-loaded" during its installation, so there is no need to load it into memory to execute it in the future. (Actually, modern OSes already do something similar. But I feel they don't go far enough.)
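That "something similar" is demand paging, and it is visible even from user code. A small Python sketch using mmap (the scratch file here is a throwaway temp file): the file becomes addressable without being read up front, and the OS pulls in pages from disk only when they are actually touched.

```python
# mmap makes a file addressable without reading it up front; the OS
# pages bytes in from disk only when they are actually touched.
import mmap
import os
import tempfile

# Create a 4 MB scratch file of zero bytes.
fd, path = tempfile.mkstemp()
os.write(fd, b"\x00" * (4 * 1024 * 1024))
os.close(fd)

with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), 0)   # map the whole file; nothing is read yet
    middle = mm[2 * 1024 * 1024]    # touching this byte pages it in on demand
    mm.close()

os.remove(path)
print(middle)   # 0
```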
This works well for a small device, where all apps and files are under 4 GB. However, even digital cameras need to access over 4 GB of data today.
We need a universal 48-bit addressing mode. This allows 256 TB of data.
My manager asked me this question a few weeks before he ordered my new PC: which is more important, CPU, RAM or HD?
I replied "memory". Then I added, "it depends". That's because once you have enough memory, it would cease to be a factor. (In other words, you just need enough RAM to avoid swapping.)
This was based on my experience with my Windows PC, where I run several apps (usually 8 to 10) and keep multiple web pages (10 to 30) open all the time. I seldom run CPU-intensive programs. I'm always short of memory. More memory means less swapping, hence increased speed.
It's a different story when it comes to my Linux PC. I seldom use it other than editing and compiling the source code. The most common apps I run are the shell (6 to 10 instances) and the debugger (2 to 4 instances).
Compilation is CPU-intensive. On my old Linux PC, compilation is CPU-bound: 100% CPU usage. This is not surprising because the make program launches at least 3 sub-tasks, so the CPUs are always busy.
With 8 CPUs on my new Linux PC, I thought it would become I/O bound, since we all know that I/O is slow, right?
Wrong, it's still CPU-bound: all 8 CPUs show 100% usage! I/O still has a lot of margin (bursts of 20 MB every few seconds), and RAM is hardly used (600 MB).
So, never guess, but always profile.
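One crude way to profile this from code, sketched in Python (the 0.5 threshold is an arbitrary choice of mine): compare CPU time against wall-clock time. If they are close, the task is CPU-bound; if CPU time is much smaller, the task spent its time waiting on something such as I/O.

```python
import time

def classify(fn):
    """Run fn and report whether it was CPU-bound or spent its time waiting."""
    wall0 = time.perf_counter()
    cpu0 = time.process_time()
    fn()
    wall = time.perf_counter() - wall0
    cpu = time.process_time() - cpu0
    # CPU time close to wall time means the CPU was busy the whole time.
    return "CPU-bound" if cpu / wall > 0.5 else "waiting (I/O, sleep, ...)"

print(classify(lambda: sum(i * i for i in range(2_000_000))))  # CPU-bound
print(classify(lambda: time.sleep(0.5)))
```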
My requirements for my father's PC:
The three main difficulties of getting this PC: (a) the budget, (b) the choice of casing and (c) two video-outs.
The budget is stretchable, but still, I don't foresee anything more expensive than $1k (casing + CPU + RAM + HD + PSU + fans).
Slim PCs look mutually exclusive with the two video-outs requirement, but I've only seen two slim PCs: the Acer one and a HP one. SFFs should be able to support two video-outs, but the boxy shape looks terrible.
I thought of buying another 1 GB stick of RAM to upgrade my father's PC while the RAM is still available. I found that it would cost me less than $20!
There's nothing else to upgrade. With 2 GB RAM, my father's PC should be able to last 5 years and perhaps even survive a change of OS.
I then wondered how much a budget PC would cost today. Let's try:
Total $817. Cheaper and faster than 1.5 years ago. That's progress for you.
Note: prices from HardwareZone.com.
The only thing that remains the same is the monitor — although it is $200 cheaper! I like this monitor very much. In addition to its impressive technical specs, its slim borders make it very attractive.
I got a new PC for my father about 1.5 years ago (August 2007). It replaced my father's 8-year-old 300 MHz Celeron PC that finally gave up after a long and difficult struggle.
My father's requirement was simple: he just wanted to surf the net. So, his budget was very low. However, as the acquisition officer, I tacked on a lot of requirements of my own (even though I didn't use it). :-P
My requirements will be discussed in another entry.
In the end, I got these parts:
The casing was very slim (slightly wider than a 3.5" HD), so it looked slick, but it could still hold full height cards. The casing inspired me to build the system.
CPU: I didn't need speed, so I got the cheapest — 2140. If I were to build for myself, I would get the C2D 4400 — another entry level, but at least it's a Core 2 Duo.
M/B: I read a few reviews on MATX m/b and this Asus m/b seemed to have pretty good specs at a good price.
RAM: Cheap. 800 MHz RAM is too expensive. CL4 is too expensive. 2 x 512 MB for dual channel mode is too expensive. How poorly can this perform? I suspect the performance difference is less than 5%.
HD: Cheap. I could have gotten an 80 or 160 GB HD, but it's quite easy to fill the HD with junk, so I got something bigger just in case. I got a Hitachi because it's cheap. I would have gotten the Seagate 250 GB w/16 MB cache for myself.
DVD drive: got a DVD-RW writer before I realized I didn't actually need it! I got a SATA model, but the drive didn't come with a cable and the m/b only came with one, so I had to go back to the shop to buy one at $3 (rip-off price).
Monitor: this was where all the money went to. (Around 40%?)
I got the above for $1,082 by paying cash. Why pay the 2% charge for NETS? $1,000 may seem troublesome to pay by cash, but I didn't use $50 notes. I went to the bank to get one $1,000 note and one $100 note the day before.
Later, I went and replaced the really noisy and annoying Castek casing fan with a quiet one for around $20. After the change, the PC was silent enough to be a HTPC.
I reused the old keyboard, mouse, WLAN-USB adapter and speakers. The HD and DVD drive could have been reused, but I didn't bother.
List of problems:
I'm going to send the tablet PC for repair once I get back to the office. I hope they still have parts.
I lost all three of my favourite programming books: The Pragmatic Programmer, Debugging the Development Process, and Programming Pearls (2nd ed).
It's a good thing I have two copies of The Pragmatic Programmer, so I still have one left. I have no idea where the other copy went. I believe I lent it out. However, I can't remember who I lent it to.
I lost Debugging the Development Process a few years ago and never found it. This book went out of print a long time ago. Last year, I bought this book off Half.com, so I own this book again. Interestingly, I still have its sister book, Writing Solid Code.
I went to look for my third favourite book: Programming Pearls. I found it after a long search. It was not in any of my usual book spots, but in a totally different place.
What happens is that when the books overflow their pre-allocated space, the new books get scattered all over the place until I allocate a bigger space to put them together. My mother just gathers the scattered books and puts them in some obscure place that I don't know about.
It looks like I need to overhaul my physical book storage. I have not done the reorganization for years because I have cut down tremendously on buying books.
Some text editors load the entire text file into memory. Such editors are not good for browsing through log files, which can routinely be 50 MB to 100 MB.
As a rule-of-thumb, loading a file should be <5s and use <25% of free memory.
Most OSes have no special support for text files, so handling them is unnecessarily slow.
The OS should allow "partial sector" usage, so that when we insert or delete some text, we don't need to shift the rest of the file. Partial sector means a sector doesn't have to be used in full.
Suppose we have a file of size 10,000 bytes. With 512-byte sectors, the file is spread across 20 sectors. Now, if we insert just one ASCII character right at the start of the file, all 10,000 bytes must be shifted by one and all 20 sectors must be updated.
What if the OS supports partial sectors?
The first character is put into a new sector. Only one byte is used. The other sectors are untouched. This wastes some space, of course. However, the OS will auto-merge partial sectors whenever it can, so the wastage is kept at a minimum.
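Here is a toy model of the idea in Python. The class and its names are invented for this sketch: a file is a list of sectors, inserting at the front allocates a new one-byte partial sector instead of rewriting every sector, and a merge pass re-packs the sectors to reclaim the wasted space.

```python
SECTOR = 512

class PartialSectorFile:
    def __init__(self, data: bytes):
        # Split the initial data into (at most) 512-byte sectors.
        self.sectors = [bytearray(data[i:i + SECTOR])
                        for i in range(0, len(data), SECTOR)]

    def insert_front(self, byte: int):
        # A classic layout would rewrite every sector; here we prepend a
        # sector holding a single byte (1 used, 511 wasted until merged).
        self.sectors.insert(0, bytearray([byte]))

    def merge(self):
        # Re-pack everything into full sectors, reclaiming wasted space.
        flat = b"".join(self.sectors)
        self.sectors = [bytearray(flat[i:i + SECTOR])
                        for i in range(0, len(flat), SECTOR)]

    def read(self) -> bytes:
        return b"".join(self.sectors)

f = PartialSectorFile(b"x" * 10_000)   # 20 sectors, as in the example above
f.insert_front(ord("!"))               # touches 1 new sector, not all 20
print(len(f.sectors))                  # 21
print(f.read()[:3])                    # b'!xx'
```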
A file contains data. A folder contains files. It used to be so simple. Not any more.
A file can now store other files. Why? Either for mere compression or as a package. It acts more like a folder than a file in this aspect. Some OS allow you to browse into such files.
A file has a data format. This is application specific. Other applications are not able to read the file unless they know the format. While this may sound obvious, the question is, why does each application need to code the file reader/writer? Why can't the code be shared through a common interface?
This will allow every application to read/write to all kinds of image files, for example.
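One way such sharing could look, sketched in Python: a central registry keyed by format name, so each reader is written once and every application calls through the same interface. (The registry and the readers here are invented for illustration; this is not a real OS facility.)

```python
# A shared reader registry: each file format registers one reader, and
# applications go through read_file() instead of writing their own parser.
readers = {}

def register_reader(fmt):
    """Decorator that registers a reader function for a format."""
    def deco(fn):
        readers[fmt] = fn
        return fn
    return deco

@register_reader("bmp")
def read_bmp(data: bytes):
    # A real reader would parse the pixels; this stub just reports metadata.
    return {"format": "bmp", "bytes": len(data)}

@register_reader("png")
def read_png(data: bytes):
    return {"format": "png", "bytes": len(data)}

def read_file(fmt, data):
    """The single shared interface every application uses."""
    return readers[fmt](data)

print(read_file("png", b"abc"))   # {'format': 'png', 'bytes': 3}
```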
It is time to discard the folders and files concept. We will replace it with the container and stream concept. While it sounds like just a change in name, it's a paradigm shift in thought. A container is more generic than folder. It can be either a real folder or a zip-file, for example.
A real file is often made up of multiple independent parts. However, we tend to impose a physical ordering on them because we tend to think of files as one-dimensional streams. Not any more. We should break the file into independent streams, so that we can optimize their usage.
For example, resizing one section can be a very expensive operation. You need to make space for the section by creating an empty space for it. This could involve reading and writing half the file.
With streams, you just resize one stream. The other streams are not touched at all.
An old timer will know that a filename used to have the 8.3 format: a file name of 8 characters and an extension of 3 characters. This is on DOS, of course. Very old Unix used to have a 14 character limit.
These days, files can have names up to 256 (or so) characters, but they still have an extension and they still cannot store some special characters. (This is true on Windows XP.)
Why should it be the case?
Why isn't the file name a mere attribute of a file? The extension indicates the type of the file and should not be part of the file name at all.
When we rename a file, it should not break any previous linkage to it at all. Almost every OS gets this wrong, even today.
Every file in a drive should have a unique id. Applications should access a file using its id, not its name.
The OS should allow multiple references to a file as well. It should be smart enough to keep just one copy and not duplicate them, wasting space.
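Unix file systems already work roughly this way: every file has a unique id (the inode number), hard links are multiple names for one copy, and renaming doesn't change the id. A quick check with Python's standard library (assuming a Unix-style file system):

```python
import os
import tempfile

d = tempfile.mkdtemp()
a = os.path.join(d, "original.txt")
b = os.path.join(d, "alias.txt")

with open(a, "w") as f:
    f.write("hello")

os.link(a, b)                        # a second reference, not a second copy

same_id = os.stat(a).st_ino == os.stat(b).st_ino
links = os.stat(a).st_nlink          # two names, one copy on disk

c = os.path.join(d, "renamed.txt")
os.rename(a, c)                      # renaming does not change the file id
still_same = os.stat(c).st_ino == os.stat(b).st_ino

print(same_id, links, still_same)    # True 2 True
```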
A 64-bit (IEEE 754) floating point number can store a 54-bit integer (52 explicit bits, 1 implicit bit and 1 sign bit), enough for 15 decimal digits. This is enough for most purposes.
If speed is not essential, I think this is a good one-size-fits-all solution.
The floating point hardware unit should detect when the 64-bit floating point number is in fact an integer (no fractional part) and use integer operations.
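The exact-integer range is easy to demonstrate: integers of magnitude up to 2^53 round-trip through a double exactly, and 15 decimal digits are always safe. In Python, whose float is the same IEEE 754 double:

```python
limit = 2 ** 53

# Every integer up to 2**53 is exactly representable...
assert float(limit) == limit
assert float(limit - 1) == limit - 1

# ...but just past it, precision is lost: 2**53 + 1 rounds back to 2**53.
assert float(limit + 1) == float(limit)

# Any 15-digit decimal integer survives the round trip.
n = 999_999_999_999_999
assert int(float(n)) == n
print("all checks passed")
```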
32-bit integers suffice for most day-to-day operations, but there is one area that has already exceeded its limit: disk size. The smallest hard disk today is well over 4 GB. The newest photo cards are mostly bigger than 4 GB.
All OSes today use at least 48-bit integers for file size and disk size (max 256 TB). Once, it was inconceivable that files could exceed 4 GB. Even today, very few (kinds of) files exceed this limit, except for databases and videos. However, file systems often exceed this size.
This is not a big deal. File offsets can use 64-bit integers whereas the rest of application use 32-bit integers.
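To see why 32-bit offsets break down past 4 GB, simulate 32-bit unsigned arithmetic with a mask (Python's integers are unbounded, so the wrap-around has to be applied by hand):

```python
MASK32 = 0xFFFFFFFF          # 32-bit unsigned arithmetic wraps at 2**32

def add_u32(a, b):
    """Add two file offsets the way a 32-bit unsigned integer would."""
    return (a + b) & MASK32

four_gb = 4 * 1024 ** 3

print(add_u32(four_gb - 1, 1))   # 0: stepping past the last byte wraps to zero
print(add_u32(four_gb, 512))     # 512: offsets into a >4 GB file alias low ones
```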
There is a more serious limit: memory. 32-bit memory implies 4 GB of maximum memory. Some OS restrict applications to just 2 GB, but that's a design fault. Obviously the designers never expected RAM to get so cheap.
RAM was very expensive a decade ago. An 8-CPU mainframe serving, say, 50 to 100 users had only 64 MB RAM. Today, even entry-level PCs have 2 GB RAM.
It is easy enough to install a 64-bit OS to use more than 4 GB RAM. The Intel processors have supported 36-bit addressing since the Pentium Pro, allowing 64 GB RAM.
But it isn't so straightforward to rewrite an application to do so. First, it implies a 64-bit application, with 64-bit integers. That creates havoc on compatibility, because legacy code is mostly written with 32-bit integers in mind. It is also almost impossible to mix 32-bit integers with 64-bit pointers, since most code (written in C) assumes they are somewhat interchangeable.
Soon, partitioning and defragmenting a hard disk will be history. This will happen when solid-state disks are in wide use.
Why do we need to partition a single hard disk into several logical drives? There are two main reasons: to install different OSs (with different file systems) and because very old OS (like DOS) could only support partitions up to a certain size.
For example, DOS 2.0 supports hard disks up to 15 MB, DOS 3.0 up to 32 MB, DOS 3.3 supports multiple partitions, and DOS 4.0 supports partitions up to 2 GB.
There is no longer any need to split up a hard disk into several partitions due to file system limitation. However, there is still one good reason to partition a disk: to reduce disk fragmentation.
Disk fragmentation is a bane. It slows down disk access, and disk access is already orders of magnitude slower than RAM access.
By partitioning, we can store frequently modified — hence prone to fragmentation — files on one partition, and keep the other mostly static files in another partition. We can even keep infrequently accessed files in yet another partition so that the active partition is smaller and hence the seek time is less.
However, solid-state disks have constant access time and no seek time. Thus, there is no need to partition to avoid disk fragmentation. In fact, there is also no need to defragment at all due to constant access time!
For now, that is. In the future, it may be possible to power down unused sections of the solid-state disks, so it's still useful to use just part of it most of the time. Hence it may be still useful to partition the disk.
Some things never change.
My current machine: Xeon 2.8 GHz, 1 CPU, 2 cores, 512 kB cache each, 1 GB RAM, 50 GB HD. Takes 60 minutes to build the entire code from scratch.
My new machine: Xeon 3.0 GHz, 2 CPUs, 4 cores each, 2x 6 MB cache, 4 GB RAM, 450 GB HD. Takes 6 minutes.
Wow. Just wow.
The old CPU isn't really that slow. Each core is rated at 5571 bogomips. The new core is 6000, just 7% faster.
I believe the old HD is slow like hell. The new HD is fast, and silent — even when compiling. The whole casing is very silent.
I've never had such power in my hands. Let me do some video encoding before they decide to make it a shared server. (There are plans to do so.)
RAM is so cheap these days; S$20 for 1 GB! Programming should be a piece of cake!
Do we need fancy data structures? I regularly encounter arrays and hash tables, very seldom linked lists, and almost never binary trees, not to mention even more exotic data structures.
In C, after you get past the syntax, there are two things that remain hard: strings and memory management. These days, interpreted languages have two common characteristics: typeless variables and garbage collection. They make it very easy to whip up simple programs.
Here's what I would do if I were to reinstall Windows XP on my Tablet PC (Pentium-M 1.2GHz, 1.5GB RAM and 55.8GB HD).
First, I'm not going to repartition the HD. It has currently three partitions of 9.76GB (App), 9.76GB (Data) and 36.3GB (Archive).
I like to separate code from data, so I'll always have at least two partitions. If nothing else, it makes it easy to reinstall the OS without affecting the data. (Even so, I always back up the essential data — you never know when things go wrong.)
I usually have two data partitions. The smaller one, the Data partition, contains the working data. The bigger one, the Archive partition, contains data that don't change much, if at all. The Data partition is of course next to the App partition to reduce seek time. This will become irrelevant for solid-state drives.
The App drive will have 7.26GB free after taking into account the swap file (1GB) and hibernation file (1.5GB). It is enough for Windows and most of the apps that I plan to install. In the past, I started with a mere 4GB. It ran out of space after a while and I had to install apps in another drive. Then, I upsized to 6GB and it still ran out of space after a few months. I finally settled at 10GB.
For Vista, I would make the App drive 12GB to 15GB.
Before installing Windows, it is essential to copy out the current settings:
It is also advisable to keep all the updated drivers and apps you intend to install on hand. They can even be already in the Archive partition.
To make it quicker to get Windows XP up to speed, it is recommended to slipstream SP3 into the XP CD so that you install Windows XP SP3 directly. (This is very easy to do and the instructions can be found online.) After you do this, you only need to apply about 30 patches as opposed to 170+ patches if you install Windows XP SP2.
The first thing I'll do after installation is to move C:\Documents and Settings to D:\DnS. This can only be done by editing the Registry in a series of steps. It's not particularly complicated. Plus, even if you mess it up, there's nothing to lose because it's a clean installation. Again, the instructions can be found online.
Note that the permissions will be wrong and you need to change the owners manually.
It is also possible to tell Windows XP to create the Documents and Settings elsewhere during installation, by using its automated installation mode. However, I've not tried it before.
This must be done before installing any app because they may store hard-coded paths to it in their config files.
I will then move dllcache out of the App drive too.
After that, I will install the updated drivers and apply the updates. It's best to do this without going online. Still, I think it's fine if you go only to Microsoft's website to download the updates.
(Note that you may need to configure your wireless settings first. Be sure you're on an encrypted connection.)
At this point, I would turn off all graphical effects, change the swap file settings and turn off unused services. There are online Windows Service guides that tell you which services can be turned off.
Now, I would copy most of the cache directories out of the App drive to the Archive drive. The directories include:
These folders must be manually updated from time to time as Windows XP adds to them periodically. I'll copy the contents back when they are needed. Otherwise, they stay out of the App drive.
The next thing to do is to create two user accounts: one normal account and one browser account.
The normal account is just a user-level account. It is not a power user and it has no administrator rights whatsoever. It should be used most of the time. It should only be necessary to drop into the Administrator account once in a while to install drivers and apps.
The browser account is also a user-level account. It is even more restricted: it only has access to its own data directory. The idea is to run IE in it (using Run As) so that any hacks through IE are contained. (The hacker needs to hack through the OS to gain privileged access.)
The first obstacle you'll encounter is that you can't double click on the time on the taskbar. This is because it does not support read-only mode. Luckily, this can be changed via the group policy editor.
The Windows XP installation is now complete! Proceed to install the apps!
After upgrading my RAM, I'm left with just 573MB (out of 9.76GB) on my app drive. As someone who likes to keep his app drive lean, this annoys the hell out of me.
This is the first-level breakdown:
| Documents and Settings | 718MB |
The hibernation file went from 512MB to 1.5GB after I upgraded the RAM. I also increased the swap file from 768MB to 1GB, although there is no reason to do so.
I don't intend to clean up Documents and Settings as I intend to move it to another drive. I had already relocated the temp folder and the browsers' cache to another drive.
I install my apps in App rather than Program Files.
The Windows folder accounts for almost 46% of the used space! Some of the biggest contributors:
Note that I have already relocated dllcache to another drive.
The first four entries are either cache or backups. That's almost 2GB there! The next two are .Net stuff: 725MB.
Altogether, the real Windows code/data is at most 1.2GB to 1.5GB.
I bought 1GB RAM for my notebook. To my pleasant surprise, the user-memory slot was empty, so it means I have 1.5GB in all — from 512MB.
The RAM was more expensive than I thought. An online local price site lists it at S$55, but the cheapest I found was S$85. The RAM was expensive because it is an old type of RAM (DDR-333). The newer DDR2-667 1GB RAM costs just $30!
In the end, I bought the RAM for $87. I prefer Kingston over Corsair, although there's no basis for that.
The notebook is now so much snappier. 512 MB is (just) sufficient for Windows XP, especially if you turn off eye-candy and some useless services, but it's insufficient for Windows XP Tablet. It requires at least 768MB.
I had planned to get another 1GB RAM in the near future. It requires me to open up the notebook to replace the factory-installed RAM. I don't think it is needed now.
In the anime Scrapped Princess, the Peacemakers requested permission to use a "first class divine punishment" attack in the face of powerful foes. I was always fascinated by that — keeping your real power in check and only using a subset.
I wish I could do something like that now.
"Permission to use first class power".
But the real world isn't like that. It takes effort — real effort — to live up to any increase in "power".
And in the real world, unused superior power/skills do not automatically work better. Think of unused mechanical equipment. It takes time to run-in.
Some server-side code is needed, but it's transparent to the players.
My manager gave us three months to come up with interesting ways to process a list of 31,975 websites.
The response was better than what I expected. There were nine entries from fifteen engineers.
He then wanted us to rank one another's programs from 1 to 3 (using KPI, actually).
I ranked the creative entries pretty highly. It is not proportional to the effort put in. That's how "beauty" contests work.
My own entry wasn't very creative, but at least it looked decent: Trace Route. (Tested on IE 6, IE 7, Firefox 3, Opera 9.5 and Chrome beta.)
I stumbled on a pretty good article on memory: What every programmer should know about memory.
In these days of multi-GB memory, "infinite" virtual memory and automatic garbage collection, programmers have forgotten how to manage memory.
I googled "Programmer" and found this interesting writeup as the fourth link: How to be a programmer.
If you like The Pragmatic Programmer, you'll love this writeup. If you like this writeup, you should read the book too.
I got inspired to write about programming stuff after talking to a friend about the difference between programming in school and in the real world.
There are many things that irk me after a few years of programming in the real world. I will cover mostly C, because it's very easy to trip yourself up in it. I don't expect to add new entries very often, though.