What happened to this blog?

This blog has been down for about four months.


The blog was hacked; the protection I had in place wasn’t strong enough, it
would seem.

My bad, I had been told that Windows Server 2003 was vulnerable, especially with an open blog (open as in readers could post comments), and the blog paid the price.

To remedy the problem I had to move the www.diy-computer-repair.net web site and this blog to a newer Operating System that had the protection, namely an add-on called WebDAV, with which I can specify who can or cannot alter web pages.

Why did it take so long?

The www.diy-computer-repair.net site uses what programmers call “includes”. An include is a special file with formatting used to place common lines of text or images in certain places on each page. This gives the web site continuity in its appearance: the logo image is in the same place on every page, as is, say, the navigation menu at the top or bottom of each page.

When I decided to go from Windows Server 2003 (which I am quite familiar
with) to Windows Server 2008 or 2012, I knew I would have a steep learning curve. These newer (relatively speaking; the newest is Server 2019) operating systems are closely related to Windows 7 and Windows 8.

Having used Windows 7 for about six years, and knowing that I don’t like its file system or a few other things, I would have to figure out how to make the newer version of IIS (Internet Information Services) do what I needed so the www.diy-computer-repair.net web site would function as it did with the older IIS 6.

I picked Windows Server 2008 R2 because it was the closest to Windows Server 2003 and Windows 7. However, I would learn that the programmers had moved from the old way of doing things to a new way; the improvements are not without hassle, nor are they easy to use.

So for three or more long months I worked on getting the web site to work. One of the things that caused me the most heartburn, frustration, and irritation was the change to a programming technique called SSI (Server Side Includes). With IIS 6 and below, SSI was an optional addition to the installation of the service: you could choose to use it or not use it, and if you added it to the configuration, all you had to do was enable (“Allow”) it. If a web page used one of the SSI commands, the service would read the command when a browser requested the page and format the page accordingly.

An include command is formatted text inserted into the document when the creator writes it. The command normally calls another page or snippet of formatting and places it in the page; thus every page gets the same logo, menus, and other common information without the creator manually adding them to each page. If, say, I need to change something on a menu (like adding a new page or category), I change the one file that shows up on every page instead of going through over 800 pages to make the change.
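For example, a page that uses includes might look like this (the file names here are made up for illustration; the directive syntax is the standard SSI form):

```html
<html>
  <body>
    <!--#include file="header.html" -->
    <p>The unique content of this page goes here.</p>
    <!--#include file="menu.html" -->
  </body>
</html>
```

When the server processes SSI, each `<!--#include -->` comment is replaced with the contents of the named file before the page is sent to the browser; a browser that receives the raw page just sees an ordinary comment and shows nothing there.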

This SSI function is disabled by default in both IIS 6 and the newer versions of IIS (I am using IIS 7.5), so I had to figure out why the includes on each page would not function and fill out the page with the formatting you see when you read it.

For IIS 6, once enabled, SSI worked without any additional programming or “scripts”; on IIS 7+ it didn’t work. So I had to do a bit, no, make that a lot, of research to find out why this command, #include, would not work on IIS 7+.

There are a lot of opinions on the internet; there are a lot of solutions also. However, none of them worked. I took each opinion/solution and tried it on the IIS installation; some of them caused other problems, and some caused IIS to stop working entirely.

After the first failure I decided not to reinstall the OS after each attempt at getting the #include instruction to work. Instead I made an image of the OS partition (using a multiboot setup saved me a few hours of extra work when the OS crashed!) and would overwrite the failed OS with the original install.

One day I was searching for why the #include would not run when I saw a small blurb (the text in the search result) saying that SSI would not run a #include on a .html formatted page; it had to be .shtml. So I clicked on the link to read what the author had to say.

Basically, the programmers took the easy/lazy way out and left out the code in IIS that looked through the text of the requested document before it was sent to the reader’s browser. In the IIS 6 version, that code would read through the page, and when SSI was activated by #include (and other commands) it would fetch the named document, such as doc2.shtml, or format the output as requested. That code was left out or changed, and now the only way #include will work is if the page itself uses the .shtml extension instead of the normal .html.
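(For completeness: IIS 7+ can reportedly also be told to run the SSI handler on plain .html pages with a handler mapping in web.config, an alternative I did not use. A sketch of that idea; the attribute names are as I recall them from the IIS documentation, so verify on your own installation before relying on it:)

```xml
<configuration>
  <system.webServer>
    <handlers>
      <!-- Map *.html to the same native SSI module that handles *.shtml -->
      <add name="SSI-html" path="*.html" verb="GET,POST"
           modules="ServerSideIncludeModule" resourceType="File" />
    </handlers>
  </system.webServer>
</configuration>
```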

So, to test this theory, I wrote a test page with only the #include, put it on the web site, and browsed to it. The SSI worked! WOW!

OK, that was great; however, it left me with two options:

1. Move all the #include statements to another programming language such as ASP or JavaScript (neither is in my programming tool box…)

2. Rename all the pages on the web site from .html to .shtml, then go through all the pages and change each link from the old .html name to the new .shtml name.

I took the easy way out: I renamed all the pages to the .shtml extension, then used an old program I am quite familiar with to change all the links.
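For the curious, the same bulk rename and link rewrite could also be scripted. A minimal sketch in Python (this is not the program I used; the directory is a placeholder, and since it edits files in place you should run it against a copy of the site):

```python
import os

def rename_and_relink(site_root: str) -> int:
    """Rename every .html page under site_root to .shtml, then rewrite
    the links inside the pages to match. Returns how many files were renamed."""
    renamed = 0
    # Pass 1: rename the files themselves.
    for dirpath, _dirs, files in os.walk(site_root):
        for name in files:
            if name.endswith(".html"):
                old = os.path.join(dirpath, name)
                os.rename(old, old[:-len(".html")] + ".shtml")
                renamed += 1
    # Pass 2: rewrite the links inside the (now .shtml) pages.
    for dirpath, _dirs, files in os.walk(site_root):
        for name in files:
            if name.endswith(".shtml"):
                path = os.path.join(dirpath, name)
                with open(path, encoding="utf-8") as f:
                    text = f.read()
                # Blunt replace: this also rewrites links to OUTSIDE sites
                # that end in .html, so a real pass would need to be pickier.
                with open(path, "w", encoding="utf-8") as f:
                    f.write(text.replace(".html", ".shtml"))
    return renamed
```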

All of this took time, a lot of it… Good thing I am retired, eh?

Thus ends the saga of what happened to the blog:

I could have put the www.diy-computer-repair.net web site on the new server and then worked on the SSI problem while the site was live. However, that is not how I work, nor would it have been a good idea, so I left the old Windows Server 2003 with IIS 6 running until I figured out how to make IIS 7+ on Windows Server 2008 R2 handle SSI properly. If you see any broken links or problems, use the blog to send me a message by comment.

Thanks for bearing with me on the missing blog…

Posted in Security | 1 Comment

Can you really speed up your internet connect speed?

Back in the day of Dial Up internet connections everyone wanted to get the most out of their connect speed.

Then came cable and DSL or ‘Broadband’ internet connections.

Something to think about when you are considering how to optimize your connect speed.

Your connect speed is determined by the ISP, the media, and the modem/router.

Let’s take that apart –

The first part of the equation is the media. By media I mean the wire or cable that connects your router to the main switch at your ISP. The fastest, or lowest-loss, type of media is fiber optic. Then you have copper.

Most newer cable installations are fiber optic, but the connection from the fiber optic cable to your modem will be copper. Over short distances (less than 100 feet) copper is not a limiting factor, so your connect speed will be what the ISP advertises: if they say 10 megabits, you can test your connect speed and you will be very close to that figure when downloading a large file.

However, if you live or work in an older part of your city, then more than likely your media is copper, and in some cases very old copper. Copper is limited by what is known as impedance, that is, opposition to the flow of electrons.

If the ‘central office’ is over a certain distance from your location then the speed will start to degrade. One way around this is for the ISP to install ‘booster’ or amplifier devices at certain distances from the central office. This booster will refresh the data as it is transmitted further down the wire.

That was one of the reasons dial-up modem connections were limited to a maximum of 14,400 bits per second (14.4 kbps): there were no booster or amplifier devices between your location and the central office, and the modem was an analog device, not digital.

So we come to the next part of the equation: the ISP.

You have no control over how fast the ISP connects to the backbone of the internet in your area. Most, however, connect in the 20–30 gigabit-per-second range, because the ISP carries the load of all their customers on those connections.

The last part of the equation is the modem/router.

Most cable/DSL modems are also routers, and the maximum connect speed of these modem/routers is hard-wired to 100 megabits per second; however, there are newer modem/routers on the market that are the gigabit-per-second type.

(Note: All connect speeds are rated in BITS Per Second not BYTES per second!)

Remember that there are 8 bits to the byte, so divide the advertised bit rate by 8 to get your download rate in bytes. A 3-megabit plan, for example, tops out around 384 kilobytes per second (take the 384, multiply it by 8, and you are back to roughly 3,000 kilobits). A modem/router of the 100 MBPS type can pass at most about 12.5 megabytes per second, and a 1 or 5 gigabit-per-second modem/router raises that ceiling further, though your ISP plan is usually the real limit. Your upload speed will almost always be lower than your download speed.
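Since the bits-versus-bytes arithmetic is easy to get backwards, here it is spelled out (the plan speeds below are examples, not anyone’s actual service):

```python
def best_case_kbytes_per_sec(megabits: float) -> float:
    """Convert an advertised speed in megabits per second to the
    best-case download rate in kilobytes per second (8 bits = 1 byte)."""
    bits_per_sec = megabits * 1_000_000
    return bits_per_sec / 8 / 1_000

# A 3-megabit plan tops out near 375 KB/s.  The familiar "384" figure is
# the same plan in binary units: 3 * 1024 * 1024 bits / 8 / 1024 = 384.
print(best_case_kbytes_per_sec(3))    # 375.0
# A 100-megabit link tops out at 12,500 KB/s, i.e. 12.5 MB/s.
print(best_case_kbytes_per_sec(100))  # 12500.0
```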

The meat of the matter is that you can only do two things to optimize your connect speed. The first is to ensure that the computer-to-modem/router link is set for either 100 MBPS Full Duplex or 1 GBPS Full Duplex.

The other thing you can do is set the MTU (Maximum Transmission Unit) to 1500; this is the maximum size, in bytes, of each packet that will be transmitted.
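To see why the MTU matters, here is a rough packet-count calculation. The 20-byte IP and 20-byte TCP header sizes are the typical minimums (an assumption; header options can make them bigger):

```python
import math

def packets_needed(file_bytes: int, mtu: int = 1500) -> int:
    """Estimate how many packets it takes to move a file when each
    MTU-sized packet loses 40 bytes to the IP and TCP headers."""
    payload = mtu - 20 - 20  # data actually carried per packet
    return math.ceil(file_bytes / payload)

print(packets_needed(1_000_000, 1500))  # 685 packets at the full MTU
print(packets_needed(1_000_000, 576))   # 1866 packets at an undersized MTU
```

Fewer packets means less per-packet overhead, which is why leaving the MTU at its full 1500 is usually the right call.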

There were some old programs (and there will be new ones) that advertised “Speed up your Dial-Up Speed”; they were scams. In the near future you will see scams advertising “Speed up your Broadband Connection”. Save your money, and try to download only compressed files.

A little research will save you a lot of money.

There are a few programs you can use to check your network speed on your computer; I use DU Meter by Hagel Technologies. Externally, you can use a web-based program to test download and upload speed; do a search for them. I use both when I feel my connect speed has degraded, before I make changes or call my ISP’s support to find out why the speed has changed.

Go ahead, post your input as a comment below. I’ll be reading every single one.

Posted in Hardware | Comments Off on Can you really speed up your internet connect speed?

Computer Cases

Back when I first started www.diy-computer-repair.com I had a catalog for a while; I wanted to help the readers of my web site by having high-quality products available for them to review or buy.

It didn’t work out, too many other web sites sell computer products.

One of the things that surprised me was the number and variety of computer cases available (and I was in the corporate world for almost 20 years!), from tiny 6×6-inch desktops to monster towers that looked like half a desktop.

Whoa! You could spend a small fortune on those things!

In reality, your computer case should have functionality: it should be easy to open, easy to remove or replace devices in, and have room for all your computer parts plus room for expansion.

Sometimes the owner wants it to be aesthetic, match the drapes or the carpet. Or have glowing and flashing lights.

With plastic and metal, and if cost is not a factor, all things are possible.

I visit a forum for overclockers frequently, and the more adventurous and ingenious people there make cases out of everything from cardboard to one that had an A/C duct in it (it cooled the computer to −14° C!).

And it just isn’t desktops that get the mild-to-wild cases. Laptops and netbooks are also in on the trend (fad?), from any color you desire to murals on the lid. (I saw one that had a Mac screen printed on the lid; it looked like the computer was on, until it was opened and the image was upside down. Cool, for a Mac.)

The main things with a case are the size (will it fit where you want to put it?), how easy it is to open and get inside to work on things like the motherboard, how easy it is to clean, and whether all the stuff you want to put in it will fit.

Back in the 90’s Packard Bell was making computers, and their case designs were the worst I ever saw: hard to open (sometimes I had to pry the case apart with a screwdriver), and some of the desktop case covers were hinged. The cover didn’t open more than 45 degrees, and the hinge was on the back of the case, which made getting cards out of the slots a challenge.

I built a computer for a guy back when drives were in the hundreds of megabytes in capacity. The case he bought (he didn’t ask my opinion before buying it) would only hold two hard drives and one CD-ROM drive. After I pointed this out (he had five hard drives and two CD-ROM drives), he took it back and exchanged it for a bigger case. I liked the case he bought so well that I bought two of them; they are still in use today, sixteen years later.

Did you look at your case? Getting a little long in the tooth? Hard to clean? Need a crowbar to open it?

Not enough room for all your computer stuff?

Share your experiences with the rest of us.

Go ahead, post your input as a comment below. I’ll be reading every single one.

Posted in Hardware | Comments Off on Computer Cases

Large hard drives and the case for defragmenting ..

With newer hard drives surpassing a terabyte in size, you can store more data. Cool, no?

Maybe, maybe not.

When you have a large amount of storage you tend to keep everything even if it is no longer needed or was junk when you collected it.

Then one day you decide to ‘clean house’ and start deleting all that ‘junk’ on your hard drive.

This is where using the MS Defragment program vs. a third-party one comes into play:

When you defragment your hard drive with a third party program it may or may not defrag the Master File Table or MFT.

Why defrag the MFT? Because when you delete a large amount of data, the MFT has ‘holes’ where the deleted files’ object data was stored.

File object data includes:

  • The name of the file
  • The extension of the file
  • Where the file was stored on the hard drive: the starting and ending sectors; if the file was fragmented when it was written to the drive, all of the connecting start and end sectors are there as well.
  • A table entry that contains additional data, such as:
    • The author of the file
    • Attributes for which program opens the file, the date of creation, and the date last modified

In all there can be over 20 different entries of data about the file.
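You can peek at a slice of this per-file record from any scripting language. A small Python sketch using only the standard library (the MFT also stores where on disk the data lives, which the ordinary file APIs don’t expose):

```python
import os
import time

def file_record(path: str) -> dict:
    """Gather a few of the same kinds of facts the MFT keeps about a file:
    name, extension, size, and the last-modified timestamp."""
    info = os.stat(path)
    name, ext = os.path.splitext(os.path.basename(path))
    return {
        "name": name,
        "extension": ext,
        "size_bytes": info.st_size,
        "modified": time.ctime(info.st_mtime),
    }
```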

The best way to defrag the MFT is to use the built-in MS Defragment program, listed under either the Tools tab of the drive’s Properties page or the Defragment section of the Computer Management console.

The MS Defragment program will remove all these ‘holes’ where the data of deleted files was and compact the MFT. This decreases the time it takes for, say, Explorer or your favorite file-management program to read the MFT.

It also makes accessing your files faster.

If you use a third-party defragment program and the access time to open a file doesn’t change, chances are the program cannot defragment the MFT; try the built-in defragment program provided with the Operating System.


To answer a question about being forced into moving to a larger hard drive by a collection of videos and images: it isn’t the number of files you have, it is their size that counts. With NTFS there is no practical limit to the entries in the Master File Table (MFT); the only limit is the size of the volume, and when it is full, it is full.

You should make folders rather than partitions, because if you fill up the root of a volume with files, the file system has to work harder to find and list them all. Whereas if you treat the file system like a filing cabinet, with folders in each drawer, indexing the files takes less time and follows a logical pattern. (Does this make sense?)

Go ahead, post your input as a comment below. I’ll be reading every single one.

Posted in Hardware, Operating Systems | Comments Off on Large hard drives and the case for defragmenting ..

What is your computer repair strategy?

Do you even need a computer repair strategy?

As a computer owner, are you Reactive or Proactive when your computer has a problem?

On a side note: in the IT (Information Technology) world, computers don’t have problems, they have Issues. (Something I think everyone should know.)

Reactive strategy: Your computer has an Issue (problem) where a device (hardware) has failed. You immediately call your local repair shop, bug the IT Department at work, or call a knowledgeable friend/cousin/nephew to fix the problem. You don’t know what a backup is, and if you do, yours is about two years old. You may well lose all your data.

Proactive strategy: You have done some research into do-it-yourself computer repair. You have the hand tools you would need to remove and replace a failed device. You have acquired the necessary software (see IT Tool Box below) to assist in fixing software problems with the Operating System or the installed software. You have also read up on (and maybe own books about) what makes a computer work, and you can identify most of the parts in the computer. If it fails or has an Issue, you can either fix it or find the correct procedure to fix it within a few minutes.

A reactive owner will spend a lot of time trying to find someone to fix their Issue, and then spend more time and money having it fixed. (This is OK, because they help me pay my mortgage every month, and I am planning to buy a Corvette soon.)

A proactive owner says: Let it happen, I will fix it! (No worries, and there goes the Corvette).

What does a proactive computer owner need that the reactive owner would not prepare for?

First is knowledge: finding where the advice/information is located, and being able to use that knowledge when needed.

A few sources of this knowledge are:

  1. The IT Department at work
  2. A library
  3. The internet
  4. DIY manuals

One of the problems with using a library or the internet is the jargon, or, as I like to call it, Geekese. Decoding what the author of a repair article has written is time-consuming at best and next to impossible for most people.

How do you overcome this knowledge barrier?

You could spend a few months at the local college and take some computer science classes. That would help a little, but if you really wanted to know what these geeks were talking about, you would have to join one of their cliques or hang out at one of their haunts, like an online forum or the local coffee shop (they still go there, don’t they?).

Of course, you could bug your company’s IT Department every time you had a problem, err, Issue, but that would get old in short order.

So you are down to the last resort: Do It Yourself manuals.

These manuals range from the beginner level to the expert: from the Dummies series to the technical manuals that engineers use.

Which one is right for you?

One of the problems with the library and the internet is that the books and articles are written for people who already understand the subject matter. That is, they are written by someone knowledgeable in the computer field who will, for the most part, use computer jargon and nomenclature. (Geekese for Geeks; I just did it in this paragraph.)

A Proactive owner would have to find repair manuals written in everyday language by someone with the time and inclination to translate the Geekese into plain language. That is not an easy task, but it has been done a few times.

For those just getting a computer, a beginner’s series of books would be a good starting point.

For those that are somewhat computer literate and feel the need to save some time and money, there are Do It Yourself books, e-books, e-courses, and checklists available. (An e-book is a file that can be read on any type of computer or document reader, such as a Kindle, iPad, or Nook.)

As a retired Systems Admin with over 25 years of IT experience, I have written a series of Do It Yourself books, e-books, e-courses, and checklists for anyone wanting to do Proactive DIY computer repairs, already translated from the normal computer jargon into everyday English.

To see a list of the necessary software, the IT Tool Box page lists all the tools you would need.

If your computer repair strategy is to save time and money then the only option is to DIY!

Go ahead, post your input as a comment below. I’ll be reading every single one.

Posted in Hardware, Operating Systems | Comments Off on What is your computer repair strategy?