Tag Archive: Windows


So, storm on Monday, there’s a power outage. Everything important is on a UPS, and though the battery did run out due to the length of the outage, everything seemed to survive and powered up OK.

I have a Netgear ReadyNAS Duo that I use for some storage and file backups, and normally my backup software burps, because it loses its connectivity to the ReadyNAS, and this seemed to be the case, so I did my usual tweaks to fix it, and walked away.

Unbeknownst to me, the usual tweaks hadn’t worked, because it wasn’t the usual problem! The ReadyNAS now won’t recognise either drive, so I am running R-Linux for Windows to scan the drive and recover the data. At the moment it looks hopeful, and I at least have a mirrored array to drop the files onto for now, but it’s a major pain in the rear!

R-Linux can be found here: R-Linux for anybody in the same boat, and I will update once I have the results, but having a weird file system is not one of the ReadyNAS Duo’s best features, it has to be said!

Update number 1

It’s looking good so far for R-Linux 🙂

It found about the right amount of data in my ReadyNAS disk, and is currently in the process of restoring it for me.

TBH if certain files can’t be recovered, it really isn’t the end of the world, as long as the main stuff can be dragged back kicking and screaming!

I guess my next update will be tomorrow.

Update number 2

Happy days, all data recovered. I just need to work out what’s up with the ReadyNAS now!

Update number 3

So, the ReadyNAS is back online, but a factory reset was required. I’m certain I have a config backup somewhere, but I’m thinking a clean, fresh setup is the way to go, so best to just crack on. Don’t you just hate it when hardware crashes that badly! Damn storm.

On the positive side, it’ll give me an opportunity to have a clean up of all the crap I kept on there!!

So, I recently decided to retire my Exchange 2003 server, in favour of Exchange 2007.

There were a lot of reasons for this.

Firstly, my Windows 2003/Exchange 2003 server was an old HP DL380 G3. In itself a great, reliable server, which cost me only £20 off eBay, but getting on a bit, and very noisy in my home office, especially combined with the HP DL360 G2 I was using for an Untangle server, and the newer Dell T105 which was already running Windows 2008 and Exchange 2007.

Secondly, at work we have Exchange 2007/2010, so I had a desire to expand my skills to better enable me to understand the environments I need to support.

Thirdly, I wanted to also migrate a number of sites I ran (OWA, WordPress) over to IIS 7 due to the improved security features, and ease of management.

And last, but by no means least, the two HP servers are making my electricity bill look like that of an actual data centre, so I wanted to retire them both, enjoy the quieter office, and save some pennies.

The migration from Exchange 2003 -> 2007 was really not so difficult, with the only issue being the move of my wife’s mailbox, which warned there may be ActiveSync issues. It would be her account! She did have issues, so I had to re-create the account on both her iPhone and Galaxy Tab, but all was fine after that.

So why, I hear you asking, is it more secure, but more annoying?

Well, certain features like tarpitting are enabled by default, it gives less away when you try to probe it over telnet, and it has some excellent anti-spam features built in.

Now comes the annoying bit. Many things can be done via the GUI, but if you want to get at all the really neat stuff, you have to visit the Exchange Management Shell, a PowerShell CLI. Yes, I said CLI.

For example, enabling the anti-spam features requires running some scripts from the CLI, as does adding a list of custom block words to the anti-spam component.

You can add blockwords via the GUI, but it takes FOREVER!!!!

Same goes for blocked email domains and senders.

CLI = power in the case of Exchange 2007 and beyond. I guess I’d better get used to it!!
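As a taster, enabling the built-in anti-spam agents looks something like this from the Exchange Management Shell. The script ships with Exchange 2007, the path shown is the default install location (check yours), and the blocked domain is obviously made up:

```shell
# Run from the Exchange Management Shell on the server itself.
# Install-AntispamAgents.ps1 enables the built-in anti-spam agents
# on a Hub Transport server; a transport service restart follows.
cd "C:\Program Files\Microsoft\Exchange Server\Scripts"
.\Install-AntispamAgents.ps1

# Once enabled, blocked sender domains can be added far faster
# than clicking through the GUI one at a time:
Set-SenderFilterConfig -BlockedDomains example-spammer.com
```

These are Exchange-specific cmdlets, so they’ll only run on a box with the Exchange management tools installed.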

OK, so I promised to do a post on how to run the Windows version of the Minecraft server as a pseudo service.

I don’t intend to get into the ins and outs of the server.properties file, see Google for that 😛

To get the bad news out of the way: whilst it does run in a service-like manner, it won’t shut down cleanly. You have to kill the javaw.exe or javaw.exe *32 process, then go and set the service to stopped in the services console (services.msc if you want to run it manually).

It does however run GUI-less and allows you to log off the server whilst it remains running. If you also specify different ports, you can have multiple ‘services’ running and serving up Minecraft. I currently have 2 running happily side by side on slightly different ports.

Bad news out of the way, you can set it to auto start, and it works quite happily, restarting your Minecraft server after a reboot 🙂

So, how do I do this chicanery you ask?

There are only two tools you need, TBH: one is built into Windows 2003, Windows 2008 and Windows 7, and the other is freely downloadable as it’s part of the Windows 2003 Resource Kit.

Here are the steps you need, based on my install, which has the Minecraft server executable installed into a directory called Minecraft on the E: drive of my server. Don’t forget to replace e:\Minecraft with your own directory.

1. Create a directory on E: (or whatever drive you’re using) called Minecraft.

2. Download Minecraft_Server.exe to the directory you created at step 1. Run it once to create the file and directory structures it needs.

3. Download srvany.exe to the directory you created in step 1. Google it, you can’t fail to find it.

4. Open a command prompt and type the following, ensuring the space is present after the = sign: sc create Minecraft binpath= e:\Minecraft\srvany.exe

5. Hit enter. The base service is now created. Don’t worry if you stuff it up, just type: sc delete Minecraft and it’ll be gone 🙂

6. A little work remains. Execute the command regedt32.exe from the run box and locate HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Minecraft in the registry.

7. Right click on the Minecraft key, and select ‘New’ then ‘Key’. Name the new key Parameters.

8. Right click on the newly created Parameters key, and select ‘New’ and ‘String Value’. Name the new string AppDirectory, and give it the value e:\Minecraft

9. Create another new string value called Application with the value e:\Minecraft\Minecraft_Server.exe

10. Create another new string value called AppParameters with the value server.properties

11. Finally execute services.msc from the run box. Locate the Minecraft service you just created and double click on it.

12. On the General tab set the startup type to Automatic, and on the Log On tab select the Local System account.

13. Click ‘Start’ and off your Minecraft server goes. Just check for the presence of javaw.exe or javaw.exe *32 in Task Manager.

14. Log off your server, in the knowledge that Minecraft will continue to run whilst you’re away 🙂

15. Go play Minecraft and connect to your server.
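The registry half of the steps above (6–10) boils down to a small .reg file. Here’s a sketch that generates one — purely illustrative, the function name and defaults are my own, with the paths taken from the post:

```python
# Sketch: generate a .reg file equivalent to steps 6-10 above.
# The service name and e:\Minecraft path are the ones from the post;
# swap them for your own.

def make_minecraft_reg(service_name="Minecraft", app_dir=r"e:\Minecraft"):
    """Build the Parameters key that srvany.exe reads for the service."""
    key = (r"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services"
           "\\" + service_name + r"\Parameters")
    esc = app_dir.replace("\\", "\\\\")  # .reg files need doubled backslashes
    return "\n".join([
        "Windows Registry Editor Version 5.00",
        "",
        "[" + key + "]",
        '"AppDirectory"="' + esc + '"',
        '"Application"="' + esc + '\\\\Minecraft_Server.exe"',
        '"AppParameters"="server.properties"',
        "",
    ])

if __name__ == "__main__":
    print(make_minecraft_reg())
```

Save the output as, say, minecraft.reg and double-click to import — but only after the service itself exists (step 4).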

One last thing: if you do decide you want to run multiple servers, make sure you name the services you create differently, i.e. Minecraft, Minecraft2 etc., and set the port in each server.properties file. You’ll also need to create a separate directory with a different name containing all the Minecraft files, as you can only run one server per folder. I use e:\Minecraft and e:\Minecraft2.
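As an example, the second instance’s server.properties only needs the port changed — 25565 is the Minecraft default, and 25566 here is just an example:

```properties
# e:\Minecraft2\server.properties (fragment)
server-port=25566
```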

If you’re having issues with the server.properties file not being read, make sure AppDirectory and AppParameters are set correctly. You can also add server.properties to the Start Parameters field of the service.

As I said, it is a sort of pseudo service, in that it doesn’t stop when you stop the service, so you have to kill the process, but it is better than having to stay logged on to your server.

One word of advice if you do end up running more than one server on the same box, is to make a note of the PID of one javaw process so you don’t close the wrong one down by accident.

An export of my Minecraft service reg key is below. You can modify the directory names to suit and import into your registry, just make sure you create the service first.

By the way, only do this if you’re confident you’re not going to stuff up your server, as despite having tested these instructions thoroughly, I can’t be held responsible for what you decide to do to your own server 😛

Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Minecraft]
"Type"=dword:00000110
"Start"=dword:00000002
"ErrorControl"=dword:00000001
"ImagePath"=hex(2):65,00,3a,00,5c,00,6d,00,69,00,6e,00,65,00,63,00,72,00,61,00,\
  66,00,74,00,5c,00,73,00,72,00,76,00,61,00,6e,00,79,00,2e,00,65,00,78,00,65,\
  00,00,00
"DisplayName"="Minecraft"
"ObjectName"="LocalSystem"

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Minecraft\Parameters]
"Application"="e:\\minecraft\\minecraft_server.exe"
"AppParameters"="server.properties"
"AppDirectory"="e:\\minecraft"

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Minecraft\Enum]
"0"="Root\\LEGACY_MINECRAFT\\0000"
"Count"=dword:00000001
"NextInstance"=dword:00000001

So, I had some spare time today, and having downloaded the Windows 8 preview, I thought I’d have a crack at installing it.

TBH the process of installation was, if anything, easier than Windows 7, even taking me through a seamless wifi setup to get online before I’d got to the login screen.

I wasn’t over-enamoured with using my Hotmail credentials to log in, but I’m going to see how it works on a domain (if the preview works with domains), so that may be a non-issue. That said, I could have used other credentials, but the Hotmail ones seemed most logical.

What really made me wrinkle my nose though, is the new Metro UI. My initial impressions are that whilst it may be great for a tablet, it isn’t so good for a traditional PC. To be fair, I need to play with it some more, and I will, but the UI has changed so much, that I have a nagging doubt that I’ll buy into it as my next desktop OS of choice.

Assuming that Metro can be turned off, then I may yet be persuaded, but that is something I’ll need to report back on.

I understand that Microsoft want to rationalise the look and feel of the OS across platforms, but I do wonder if this is a bridge too far.

After performing a number of configuration changes on my server to improve performance, I decided that I wanted to work on performance tuning of my IIS6 install since I host all of my own web sites.

One way of doing this is to enable compression, either globally, or per site. By doing this, content can be compressed by the IIS6 server before being sent, giving savings on bandwidth.

This does rely on support by the web browser being used, but since most do support compression nowadays, it’s well worth doing.
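To get a feel for the sort of saving involved, here’s a quick sketch using Python’s gzip module as a stand-in for IIS’s compression (the sample markup is invented, but HTML is typically this repetitive):

```python
# Rough illustration of what HTTP compression saves on text content.
# Python's gzip stands in for IIS here; real savings vary by content.
import gzip

# A repetitive chunk of markup, as HTML tends to be.
html = ("<tr><td class='cell'>some row content</td></tr>\n" * 200).encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {ratio:.1%}")
```

Text compresses extremely well; images and other already-compressed content won’t shrink much, which is why static/dynamic text compression is where the win is.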

Since I don’t run many sites, I decided to enable compression globally. If you want to do it per site, then look here: Enabling HTTP Compression (IIS6).

So, my first stop was to create a new directory for the compressed files on my D: drive, since I also wanted to move the folder to a new location at the same time. The default is at ‘C:\WINDOWS\IIS Temporary Compressed Files’, however I didn’t want the folder to be slap bang in the middle of my Windows directory. If you’re creating a new folder for the first time, then make sure the IIS_WPG user has full control of the folder, and, if you have an identity for your app pool(s), then that will also need full control.

I then opened up IIS Manager, found the folder marked ‘Web Sites’, that contains all my sites, and right clicked on it. I then selected ‘Properties’, and once the dialog box had opened up, located and selected the ‘Service’ tab.

I then checked the boxes ‘Compress application files’ (dynamic compression) and ‘Compress static files’.

I also checked the box to restrict the directory size, and chose to restrict it to 100 MB, after which it will clean out the oldest files.

I clicked ‘Apply’, then ‘Ok’, and that was it, compression enabled!

PHP and IIS have not always got on. My last install was hand crafted, and took some time to get the way I wanted, with the extensions I needed etc.

So, I had the need to install PHP for a customer at work, on what was a new, effectively bare install of IIS, and things have improved somewhat.

After a bit of research, I discovered I needed the following:

1. PHP for Windows, found here: PHP for Windows. Recommended was the version 5.3.3 VC9 non-thread-safe installer. You may also need the Microsoft Visual C++ 2008 Runtime, which is linked to from the same page.

2. Fast CGI for IIS, found here: Fast CGI for IIS. Follow the on screen instructions to install this.

Since it was a clean install with no previous versions involved, I simply installed the Visual C++ 2008 Runtime appropriate for my OS, installed FastCGI, and finally ran the PHP 5.3.3 installer, selecting FastCGI from the list presented. I also left the installed extensions as the defaults, however you can opt to install more extensions if you so desire.

By selecting FastCGI, the installer will automatically add the .php extension globally to IIS, so any new sites you create will automatically get the .php extension.

Reboot if you need to, and once the server has restarted, create a file in the root of your default website called ‘phpinfo.php’, and edit it so it contains the following:

<?php
phpinfo();
?>

Using a web browser, visit the default site and specify the phpinfo.php file. If PHP is working, this will display several pages of information about PHP. If not, you’ll need to retrace your steps to see what might have gone wrong.

The most usual problem is that the site does not know what to do with the .php extension, so just check and make sure it matches the global one, and if it doesn’t exist for your site, you can add it using the global definition as a reference.

If you are upgrading, the procedure is much the same, except you must uninstall any previous versions before starting. This will mean your sites are down, so be sure to do it when it will have the least impact. I also took a backup of my old PHP directory as a reference before uninstalling.

Also, although the global settings for the .php extension will be updated, you will need to check all your PHP enabled sites to ensure the .php settings match the global settings, or your sites may not work correctly.

That’s it. Not too difficult, just take your time and don’t panic!

The last item I had to move to my D: drive was my web root folder that contained all of my websites.

I moved all but the default website, since that contained my Outlook Web Access, and I was wary of breaking it. As with any procedure, make sure you have good backups, and since this is an IIS 6 procedure, ensure you are running IIS 6 before using it.

The actual procedure is quite simple, since we are simply moving directories, and pointing the websites at the new location. Before proceeding, ensure you’ve checked each site is working, and that you know the location of the folder that contains the sites files.

First things first, start a command prompt, and run the command ‘iisreset /stop’. This will stop all services relating to IIS, and means there will be no locked files to hinder you.

Once the command has completed, locate the folder containing the files for your website. For example, the folder containing the files for this blog was at C:\web\http\Wordpress.

I then copied the entire WordPress folder and its contents to D:\web\http\. The folder was now D:\web\http\Wordpress.

I then made sure the permissions matched those of the folder in its original location. I then restarted IIS from the command prompt, using the command ‘iisreset /start’.

I then went to the properties of the website in IIS Manager, and at the home directory tab, changed the path to point at the new location, and restarted just that site.

I then tested the site to make sure all was well before deleting the files from their old locations.

I then repeated this for all my other sites. If you want, you can issue the command ‘iisreset /restart’ when you’re done, and this will restart the whole of IIS, just to be on the safe side.
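For the curious, the copy step itself can be sketched in code. This is purely illustrative — I used Explorer for the copy, the repointing still happens in IIS Manager, and the helper below is my own invention:

```python
# Illustrative sketch of the copy step: replicate a site's folder tree
# to the new drive before repointing IIS at it. shutil.copytree keeps
# the directory structure; NTFS permissions still need checking by hand.
import shutil
from pathlib import Path

def copy_site(src: str, dst: str) -> Path:
    """Copy a website's folder tree from src to dst; return the new path."""
    target = Path(dst)
    shutil.copytree(src, target, dirs_exist_ok=True)
    return target

# Example, using the paths from this post:
# copy_site(r"C:\web\http\Wordpress", r"D:\web\http\Wordpress")
```

Note this copies rather than moves, matching the post’s approach of only deleting the originals once the site tests OK in its new home.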

So, next on my list of ‘Things to move to D:’, were the MySQL DBs that sit behind a couple of sites I’m tinkering with, plus the DB that sits behind this very website. I’m using MySQL 5.1, but if you’re using a different version, please check before using this procedure, and as always, if you can, take a backup.

The MySQL procedure is quite different to that used for MSSQL, as you will see.

Firstly, stop the MySQL service, because you won’t be able to move a thing if you don’t.

Once the service is stopped, locate the MySQL program directory, which was C:\Program Files\MySQL\MySQL Server 5.1 for my install.

In that directory, locate the file called ‘my.ini’ and open it in notepad. Look for the comment ‘#Path to the database root’ in my.ini, and the very next line should contain the current path to all your DBs.

Locate your DBs using the path you’ve just found. Copy all of the folders to your new drive, in my case D:\MySQL, and make sure they have the same permissions as before.

Now, go back to ‘my.ini’ and comment out the line beginning ‘datadir=’ that pointed to your original DB path. The # character is used to comment out lines in my.ini.

Create a new line beneath it, beginning ‘datadir=’ and add your new path to your DBs.

For example, the line I added was datadir=”D:/MySQL/”. Save the ‘my.ini’ file, and restart MySQL.
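For reference, the relevant fragment of my.ini ends up looking like this. The commented-out original path is just an example of a typical 5.1 default; yours may well differ:

```ini
# my.ini (fragment) -- note MySQL uses forward slashes, even on Windows
# Path to the database root
#datadir="C:/Program Files/MySQL/MySQL Server 5.1/Data/"
datadir="D:/MySQL/"
```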

That should be it. Check all your DBs are working, and if not, roll back the changes and look at the event logs to see what might be wrong.

Once you’re satisfied everything is working, you can delete the old directory and free up some space.

Good luck!

So, having moved my Exchange databases and logs to the new D: drive on my server, the next on my list was my MS SQL databases. NOTE: I’m using MSSQL 2005, so whilst I know this will work for that, if you have a different product, check for the product you’re using.

As with my Exchange maintenance, I’d already created a new directory structure on my D: drive, with identical permissions to the folders containing the original databases.

Before starting this process, I stopped all of the services that were using SQL, to ensure the databases were not being accessed when I tried to move them, and I made sure I had a good backup before attempting the process. Note that you don’t need to stop MSSQL itself to do this, and in fact if you do, you won’t be able to access your DBs to detach/attach them.

I then started up the SQL Server Management Studio, located the DB I wanted to move, right clicked on it and selected ‘Detach’.

At this point, the DB will be detached, and will be removed from the list of DBs.

At this point, go to the folder(s) containing the mdf (Database) and ldf (Log) files of the database you have just detached, and move them to the new folders you created earlier. In my case, the mdf file was moved to D:\MSSQL\DB, and the ldf file was moved to D:\MSSQL\Logs.

The next step was to go back to SQL Server Management Studio, and right click on the ‘Databases’ folder. Select the attach option and a new dialog will open.

Click the ‘Add’ button, and you’ll then get a file browser that will let you find your mdf file in its new location. Select your mdf file, and hit ‘OK’. At this point you’ll probably get a warning that the log file can’t be found, and you’ll have the option to locate it. Locate your ldf file in its new location, hit ‘OK’ again, and your DB should now re-attach.

I then repeated the process for my other DBs and then restarted all my services and web sites and tested.

The detach/attach method is not the only way to move DBs around, and there’s a good resource here: Moving SQL Databases, however I found it to be a pain free way to move my DBs to my new drive.

Since I now have a server with 2 drives instead of 1, it makes sense to move my Exchange data stores over to the new drive, getting them off my O/S drive and taking advantage of the extra spindles and free space. NOTE: I’m doing this with Exchange 2003, so if you have a different version, it may not be suitable, so check first.

It is actually a pretty easy task, with the Exchange System Manager (ESM) doing all the hard work for you.

I’ll be moving the transaction logs, mailbox store and the Public folders.

To do this, start off by firing up the ESM and locating the storage group. Mine was called ‘First Storage Group’, which is the default, but yours may be called something else.

In ESM, right click on ‘First Storage Group’ and select ‘Properties’. There will be a tab labelled ‘General’. Click the ‘Browse’ button, and select the new location. In my case, I had pre-created all the folders I needed under a folder in the root of my new D: drive, so I pointed it at D:\Exchange\Logs. I also pointed the system path here in the same manner, so it would use this directory for temporary Exchange files.

Click ‘Apply’ and Exchange will make the necessary changes, and that’s the transaction logs moved.

Next, right click on the ‘Mailbox Store’, and select properties. Once the dialog has opened, select the ‘Database’ tab. In the same manner as for the transaction logs, use the ‘Browse’ button to point to the folder where you want the database to be relocated to. You can move just the Exchange database if you want, or move the Exchange Streaming database too. I did both, and pointed at the folders I’d created before starting on my D: drive.

When you hit ‘Apply’, you will (Unless you’ve already dismounted the data store) receive a warning that the datastore will be dismounted. Anybody using Exchange will be disconnected at this point, so be sure to do this when it’s not going to disturb a lot of people.

Click ‘Yes’ to continue, and Exchange will move your data store to its new home. Once complete, you’ll need to go into ESM and remount the store.

Do the same for Public folders if you wish, the procedure is identical, and again the store will need to be remounted afterwards.

Once completed, ensure your new folders have the following permissions to ensure everything works correctly:

Administrators: Full Control
Authenticated Users: Read and Execute, List Folder Contents, Read
Creator Owner: None
Server Operators: Modify, Read and Execute, List Folder Contents, Read, Write
System: Full Control

Although I chose not to delete the old MDBdata directory, you can do so, but you will need to do some registry editing. I recommend this article: Moving Exchange Data Store to a new disk as it has some useful links to aid in troubleshooting this kind of operation.

So, Exchange is now all moved to my D: drive, and so far so good. If I hit any errors, I’ll post here.

Now my Windows installation has been moved to its new hardware home, in order to improve performance I decided to do the following:

1. Move my Exchange information stores to the new second drive.

2. Move my MSSQL databases to the new second drive.

3. Move my MySQL databases to the new second drive.

4. Move my IIS6 website content to the new second drive.

This should improve overall performance since these items will no longer reside on the same drive as the O/S, and more spindles means better read/write performance.

Since each one of these tasks is not that difficult, I thought I’d share with the world! So keep an eye open, they are coming soon.

My first ‘proper’ server was a Compaq 1600R, that I got off eBay for the princely sum of £25.

Now as I started to do more with my server, I found the hardware just wasn’t up to the job.

It could only take 1 GB of RAM max, and had a pair of Pentium II CPUs at 400 MHz.

So, I hunted around on eBay, and got myself an HP DL360 G2, a dual 1.4 GHz server with 3 GB of RAM, for the outrageous price of £40!

Of course this meant I had to move my existing install off my 1600R and on to my DL360. Since it was a DC, Exchange, SQL and web server by now, I really needed the upgrade!

I found this article by Microsoft: Migrate to new hardware to be a great help in this matter.

I have now also just upgraded again, to a DL380 G3, had for just £20 (it arrived a bit bent; I straightened it out, and the seller refunded me the £20!)

Yet again, the procedure outlined in the MS document proved to be most helpful, and I have moved yet again successfully.

What is most surprising, is that the process uses NTBackup. Yes, you heard right.

I’ve tried to do the same procedure with BackupExec 12, and failed miserably, yet NTBackup did this flawlessly.

Just make sure you read the parts pertinent to your particular installation, but overall a great help and well worth a read.

As a follow up to my previous post, there is another issue that I came across. Bit of a gotcha.

Because we were upgraded to Premium SSL certs, we didn’t have the newest intermediate CA certificate on our servers, so whilst the certificate itself was valid, Firefox users experienced issues because the chain could not be verified, making the certificate appear invalid.

It was an oversight really; because we’d not really used Premium certs before, there had been no need to update the intermediate certificate.

Well, I have updated it now, and all is well, so I am a happy bunny again.

SSL Hell!

I do sometimes wonder why something that worked perfectly well the last time I did it, now doesn’t and has caused me a morning’s worth of worry.

The problem? A simple SSL certificate renewal. Or not so simple, as it turned out.

Usually, we don’t do in-place certificate renewals, because IIS6 seems to have a problem with them, meaning they don’t work properly. Instead, we use a temporary site to generate a CSR, which we then use to generate our new certificate. Once we have the cert, we complete the outstanding certificate request.

From there, we export it from certificate manager, and import it to where it is needed, usually by double clicking on the pfx file and running through the on screen prompts.

Next we do a simple certificate replace on our three webservers within our cluster.

The problem was that during the export from our original server, the private key got munged somehow, which we didn’t know, so once we deployed the cert on our live servers, the site broke quite badly.

I was able to find the problem after I installed and ran Microsoft’s SSL Diagnostic tool, which I’d highly recommend if you find yourself in a similar position.

I fixed it in a roundabout way, though I’m not completely sure why it worked.

First, I went back to the server where the original CSR was generated. I then went to the temporary site, and exported the cert using IIS6, and not the certificate management console.

I then copied the resulting PFX file to the destination, and instead of double clicking to import, I opened up certificate manager, selected my certificate store, then right clicked on the store, and selected the ‘Import’ option instead.

Once imported, I did a simple SSL cert replace on the problem site and all was well again.

I’m frustrated that the original method used to work and now, seemingly, doesn’t, but at least having had this happen, if it does happen again, I won’t be so much in the dark.