A common trade-off in modern development is that we can usually pick only one of two things when programming: Speed or Flexibility. As programmers, we know that our systems, and our clients' systems, are capable of billions of operations per second, have nearly unlimited hard drive space, and have billions of bytes of RAM available. Hardware is cheap and easy to come by these days.
So our most natural choice is to select Flexibility for our systems. As long as we are not ridiculously wasteful with our architecture, the modern machine should be able to handle it.
Think again…
Show of hands: how many of us, on our super-fast computers with a plethora of available resources, have been slowed down by unresponsiveness, had to clear applications out of auto-loading, and been disappointed with the overall performance of our entire operating system? My hand is raised.
As amazing as our machines really are, it is more amazing that we are so polluting and wasteful as programmers that we bring these workstations to their knees.
But my software won't have that much impact, so why should I be concerned?
Every other developer might say the same thing. Let's take a look at Microsoft Word. I just started a blank copy on my computer, opened Task Manager, and found it takes 40 megs of RAM plus 24 megs of virtual memory, 64 megs in total. Remember, the most common use of this is as a glorified text editor. Given that Word took 8 megs of RAM in 2000, let's consider that: in 7 years it went from 8 megs to 64 megs. If I round that up to an even 8 years and put a binary jump on it (8, 16, 32, 64), that tells us it doubles in memory size every 2 years on average. Next, considering that Microsoft never rebuilds Word from scratch, let's look at the next 8 years: (128, 256, 512, 1024). It will take 1 gig for a single instance of Word to be open. 1 gig.
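To make that projection explicit, here is a minimal sketch of the arithmetic in Python; the fixed 2-year doubling period is the assumption the whole projection rests on:

    # Start from the ~64 MB measured above and keep doubling every 2 years.
    footprint_mb = 64
    for years_from_now in range(2, 9, 2):        # 2, 4, 6, 8 years out
        footprint_mb *= 2
        print(f"+{years_from_now} years: ~{footprint_mb} MB")
    # Prints 128, 256, 512, 1024 MB: a full gigabyte eight years from now.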
Naturally, software will be limited by hardware, but let's step back for a moment. Is our software really limited by our hardware, or by our bad coding practices? What on earth is Microsoft Word doing that requires 64 megs of RAM? Is it because it is so loaded with features that it needs that much space? No. (I'm sure Microsoft would argue otherwise.)
So what can be done to fix things? How do we make it better?
1st thing I recommend is to start programming with plug-in systems. There is a plethora of free and commercial plug-in tools and source code available on the internet. A quick idea: split the menus from the plug-ins they represent, so that clicking a menu item is what triggers the plug-in being loaded. Then, if different plug-ins work together, let the plug-in system load any other plug-in on first request.
If the system has a lot of resources available, have it load all the plug-ins on another thread so they are all available; if the system starts running low on resources, start unloading the unused ones. This is something Microsoft should start pushing in .NET: a plug-in and announcement system built over the OS. Then, in the Control Panel, users could set modes like "always pre-load everything", "pre-load as little as possible", or something in between. Of course, that would mean Microsoft would have to become an example of this, and I don't really see them taking that path (although Server 2008 is a pleasant counter-statement on their part, with its far lighter footprint). You can also expose these choices in your own settings/preferences area and installer.
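To illustrate the load-on-first-request idea, here is a minimal sketch in Python. The module names and the PLUGINS table are hypothetical, and a .NET version would use assemblies and reflection rather than importlib:

    import importlib

    # Hypothetical mapping of menu entries to the modules that implement them.
    PLUGINS = {
        "Spell Check": "plugins.spellcheck",
        "Mail Merge": "plugins.mailmerge",
    }

    _loaded = {}  # plug-ins that have actually been requested so far

    def get_plugin(name):
        """Load a plug-in the first time it is asked for, then reuse it."""
        if name not in _loaded:
            _loaded[name] = importlib.import_module(PLUGINS[name])
        return _loaded[name]

    def on_menu_click(name):
        # The menu entry exists from startup, but the code behind it is only
        # pulled into memory when the user actually clicks it. Other plug-ins
        # that call get_plugin() get the same load-on-first-request behaviour.
        get_plugin(name).run()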
2nd thing I would recommend is to start considering better coding practices. Most of us don't like repeating code. In the past, I have created loops that did something to every element in an array while also identifying the one element that had a particular value. I did this with a single loop and placed an if statement inside it. While that is effective, and won't hurt much, I could have had that if statement, once it matched, hand off to a second loop that did everything the first did except the test. The result would have gotten rid of the if check on every iteration after it was no longer needed. While this is a limited example, it will hopefully open your eyes to the possibilities of how and where you can improve your code. Loops are great places to start.
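Here is a minimal sketch of that loop restructuring in Python; the process() helper is a placeholder, and it assumes the target value appears at most once:

    def process(item):
        # Placeholder for whatever is done to every element.
        pass

    # Single-loop version: the comparison runs on every iteration,
    # long after the match has already been found.
    def find_and_process_single(items, target):
        found_at = -1
        for i, item in enumerate(items):
            if item == target:
                found_at = i
            process(item)
        return found_at

    # Split version: once the match is found, a second loop finishes
    # the remaining elements with no comparison at all.
    def find_and_process_split(items, target):
        found_at = -1
        for i, item in enumerate(items):
            process(item)
            if item == target:
                found_at = i
                for j in range(i + 1, len(items)):
                    process(items[j])
                break
        return found_at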
3rd, for web or network applications, consider using a cloud architecture to support them. Terracotta is a great way to share state across a server farm and can get rid of the need for a database in many cases.
4th, Load test your applications.
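As a minimal sketch of what a load test can look like in Python (the URL, request count, and thread count are placeholders, and dedicated load-testing tools will give you far better numbers):

    import time
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "http://localhost:8080/"   # placeholder: point this at your own application
    REQUESTS = 200

    def hit(_):
        start = time.time()
        with urllib.request.urlopen(URL) as response:
            response.read()
        return time.time() - start

    with ThreadPoolExecutor(max_workers=20) as pool:
        timings = list(pool.map(hit, range(REQUESTS)))

    print(f"average {sum(timings) / len(timings):.3f}s, worst {max(timings):.3f}s")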
5th, do several proof-of-concept tests on areas that look like they might be a risk for resource usage. Not complete applications, but minimal comparisons of different approaches to the same need, with measurements taken on each one.
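A minimal sketch of such a comparison in Python; the two approaches are placeholders standing in for whatever alternatives you are weighing:

    import timeit
    import tracemalloc

    def approach_a():
        # Placeholder: materialise all the data up front.
        return [str(i) for i in range(100_000)]

    def approach_b():
        # Placeholder: produce the same data lazily.
        return (str(i) for i in range(100_000))

    for name, fn in [("approach_a", approach_a), ("approach_b", approach_b)]:
        elapsed = timeit.timeit(fn, number=10)
        tracemalloc.start()
        fn()
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        print(f"{name}: {elapsed:.3f}s for 10 runs, peak {peak / 1024:.0f} KiB")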
6th, if your system is complicated and you are not sure how users will most commonly use a section, consider putting resource counters into a demo app to count and report which areas and features are used most. Then select the proof of concept that works best with that usage and swap it in to improve performance.
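A minimal sketch of such a counter in Python; the feature names and the report file are hypothetical:

    import atexit
    import json
    from collections import Counter

    usage = Counter()

    def record(feature_name):
        # Called at the entry point of each feature in the demo build.
        usage[feature_name] += 1

    @atexit.register
    def write_report():
        # Dump the totals when the demo app exits, most-used first.
        with open("usage_report.json", "w") as f:
            json.dump(dict(usage.most_common()), f, indent=2)

    # Example calls sprinkled through the demo application:
    record("open_document")
    record("spell_check")
    record("open_document")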
7th, stop using the Windows registry. There are simple ways to provide the same functionality, such as a plain configuration file shipped with your application, and the Windows registry is getting huge.
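A minimal sketch of the plain-file alternative using Python's configparser; the file name and settings are placeholders:

    import configparser

    CONFIG_PATH = "myapp.ini"   # lives alongside the application, not in the registry

    def save_settings(settings):
        parser = configparser.ConfigParser()
        parser["general"] = settings
        with open(CONFIG_PATH, "w") as f:
            parser.write(f)

    def load_settings():
        parser = configparser.ConfigParser()
        parser.read(CONFIG_PATH)
        return dict(parser["general"]) if "general" in parser else {}

    save_settings({"theme": "dark", "preload_plugins": "false"})
    print(load_settings())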
8th, use late binding in any code path that might exit prematurely, so that expensive resources are never created if the path bails out before needing them.
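A small sketch of what that looks like in Python; the class names and the "expensive renderer" are hypothetical stand-ins:

    class ExpensiveRenderer:
        def __init__(self):
            # Stand-in for loading a large library or allocating big buffers.
            self.buffer = bytearray(10 * 1024 * 1024)

        def render(self, document, path):
            pass  # placeholder for the real work

    class ReportExporter:
        def __init__(self):
            self._renderer = None   # late binding: created only when needed

        def export(self, document, path):
            if document is None or not path:
                return False         # premature exit: the renderer is never created
            if self._renderer is None:
                self._renderer = ExpensiveRenderer()
            self._renderer.render(document, path)
            return True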
9th, do code reviews with others who can look at the code with speed in mind.
10th, re-consider the requirements. If a small change to an obscure requirement could dramatically improve performance, bring it up. Given knowledge of the impact it could have on the performance of the finished product, the deciding parties will most likely choose the faster option.
11th, if a section is so complicated that you are barely getting through it, then once you are done, trash what you have and rewrite it. You'll know the problem better this time, it will make more sense, and you will most likely see better methods now that you understand it.
It’s easy to go overboard in either direction, so all in all, keep a balance, and never forget that resources might not be as plentiful as you think.