Terminal Server and Data server?

Kenazo

Lifer
Sep 15, 2000
10,429
1
81
Currently we have a Windows 2003 terminal server set up that all of our users (25 accountants) connect to. All of our data is located on a 2nd computer connected through a gigabit connection to the terminal server. Is this good practice? To me it seems like it's just asking for slowdowns.

How else would you handle something like this? Both computers are running RAID 5 SCSI arrays. Would you just have the terminal server computer have a second RAID controller and have the data array connected directly?

 

wlee

Senior member
Oct 10, 1999
585
0
71
When we tried that, keeping the data on another machine (in our case, a NAS) was slower. I'm *ASS*uming that you have daily backups of both machines? If so, I don't know that you're gaining any advantage by keeping data only on the second machine (unless it's also serving other non-TS clients).

25 users seems a bit much for a single machine. Have you considered putting Terminal Services on the second machine and clustering, or splitting half your users to it?
 

Kenazo

Lifer
Sep 15, 2000
10,429
1
81
We have a 3rd server that runs Windows 2003 x64 with Exchange 2003, along with a 200/400GB tape backup using Symantec's Backup Exec.

We really have no non-TS clients... well, we have one or two people that run 'one-off' programs we didn't want to install on the TS computer, but this is quite minimal.

As for 25 users being too much: we have been noticing some pretty bad slowdowns from time to time, like when McAfee starts doing its thing, but for the most part the computer can keep up (dual Xeon 2400s, 4 GB of RAM, RAID 5 Ultra320 SCSI array, etc.).

How does clustering work? Do we need a whole new set of Windows 2003 licenses and CALs, or can we 'split' our current software over two computers somehow? Do the computers need to be matched? How does the setup between the two computers 'sync'? Would this be a better solution than just upgrading the hardware running our current TS rig? Though I guess, being a 32-bit version of Windows, we're stuck maxing out at 4GB of RAM in the current setup.

What is the best practice for the data? Should it be on a 2nd computer? Is there enough fault tolerance in a gigabit connection for this to even be a better solution than dual SCSI arrays on our TS computer?

So, here's our current setup:

1. TS computer - Windows 2003 TS running on dual Xeon 2400s, 4 GB of RAM, RAID 5 SCSI array. This is also our domain controller.
2. Exchange computer - Windows 2003 x64 running Exchange, our backups, and the ProtectionPilot for McAfee. It has a SATA RAID 5 array.
3. Data server - Red Hat 9, mirrored SCSI drives running on some crappy old IBM server (caps are bulging, so we're actually looking to replace this computer anyway).
4. TS users connect from P4 2-3GHz era Windows XP machines.



 

wlee

Senior member
Oct 10, 1999
585
0
71
You should change the scheduled virus scanning to a time when no one is logged in. That will help with the slowdowns. You actually can have over 4 GB of RAM on Server 2k3 (as long as the motherboard supports it), but it's still not as good as running 64-bit. I've never had identical hardware to test side by side, but subjectively, disk and network I/O are much snappier on 64-bit Server.

Well, I was *ASS*uming that the data server was a similar-class machine also running Server 2k3. In that case, you would need a new 2k3 license in addition to a cluster license. The TS client and user/device licenses should be portable as long as they are "per user" and not "per server". But none of that matters if it has bulging caps; I would get rid of it ASAP. Any safety you *THINK* you have because of the RAID 1 is negated by an unstable motherboard. We tossed a couple of machines that had the bulging/leaking caps. They kept having random blue screens and ended up corrupting data.

I was also thinking that you had thin client stations. From what it sounds like, you originally had that IBM server as your PDC/file server and everyone ran their apps on the local machine. I'm guessing you went with TS to reduce the maintenance of the local machines (hot-fixes, etc.) and/or to control troublemaker users? If not, those workstations are actually still usable-class machines as long as they have enough memory. You're in a good position to run in a hybrid environment if you upgrade or buy a new machine with Server 2008. 2k8 supports a new mode in TS that works like Citrix Presentation mode: it allows you to serve up a single app (or apps) to a user in a window with no desktop. To the user it looks like the app is running locally, but in fact all code execution is on the server. The advantage to this model is that graphics-heavy apps that don't work well in TS can be run locally instead. If you've ever tried to watch an MP4 video or use a photo editor on TS, you'll know what I mean.

You could still keep the users' home directories on the TS/DC. That model would still make management and maintenance of your "server" apps easier.


Back to the root of your original question... :)
I would move all the data back to the terminal server and add more storage space if needed. It will be faster. Just make sure you stay on top of backups!
 

Kenazo

Lifer
Sep 15, 2000
10,429
1
81
DrGreen - They don't run as admin users, if that helps. :) I didn't set it up, I just want to clean it up. I suspect the better option would be to have our Exchange computer be the domain controller.

wlee - Your assumption is right. Originally we had an NT based domain and everyone ran everything locally. The IBM was the dataserver in that network. Before that it was just peer to peer Windows 98... before that, I dunno.

Regarding the RAM issue - so if we upgraded the hardware running our TS to something more capable (like 4-core Xeons with 8-16 GB of RAM), would we see any benefit? I figured Server 2003 would have the same 4GB RAM cap as Windows XP, being a 32-bit OS. If we could just replace the hardware we could probably delay upgrading to Win2008 for quite a few years. Our current Win2003 TS setup does everything we need, it just seems to run a bit slow sometimes. But we're running Win2003 Standard Edition, so I understand it can only use 4GB of RAM, so we're kind of stuck there.

For the life of me I can not understand why we went to this 3-computer system. If there is a 5% chance that any one computer will be down, and assuming we need all 3 functions in order to operate as an office (TS, mail and data storage), we now have nearly a 15% chance (1 − 0.95³ ≈ 14.3%) of being down at any given time. If we had redundancy of some sort I could see separating roles onto different computers, but as it stands now any one of our servers being down would effectively bring us to our knees.
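A quick sketch of that back-of-the-envelope math (the 5% per-server downtime figure is an assumption for illustration, not a measurement):

```python
# Chance the office is down when N independent servers must ALL be up.
# p is the assumed probability that any one server is down at a given moment.
def chance_any_down(n_servers, p=0.05):
    """Probability that at least one of n independent servers is down."""
    return 1 - (1 - p) ** n_servers

for n in (1, 2, 3):
    print(f"{n} server(s): {chance_any_down(n):.1%} chance of an outage")
```

With all three roles on one box the exposure drops back to the single-server 5%, which is the point above; the trade-off is that when that one box does fail, it takes everything with it.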



 

drebo

Diamond Member
Feb 24, 2006
7,034
1
81
Originally posted by: Kenazo
I figured server 2003 would have the same 4gb ram cap

Standard has the 4GB cap. Enterprise (32-bit) can address more via PAE. x64 is essentially uncapped.

Also, it's considered bad practice to run either Exchange or Terminal Services on a domain controller. I wouldn't run either.

Also, packaging everything into a single computer is not the way to increase reliability. For instance, if that server goes down, now you lose mail, applications, AND data. Separate them out and you lose one or the other. Without mail, you can still run your business. Obviously, applications and data are important, but that's what robust systems and good backups are for.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Kenazo
Currently we have a Windows 2003 terminal server set up that all of our users (25 accountants) connect to. All of our data is located on a 2nd computer connected through a gigabit connection to the terminal server. Is this good practice? To me it seems like it's just asking for slowdowns.

How else would you handle something like this? Both computers are running RAID 5 SCSI arrays. Would you just have the terminal server computer have a second RAID controller and have the data array connected directly?

Twenty-five TS connections is generally nothing with newer hardware. With the hardware you have, you are pushing the limit - that is, if every user signs on and starts being active. In addition, you are also acting as a domain controller. If the DC is being utilized, you should expect severe slowdowns. (Depends on how many users are in AD.)

Nothing wrong with keeping the data on a second, directly connected computer. In fact, if you put the data on your domain controller you are asking for more trouble.

Purchase a quad Opteron 8356 or dual 2384; if you are stuck on Intel, purchase a quad 7350. The dual 2384 will run 40+ TS connections with little problem.

**EDIT**

In fact, never put data on a domain controller; always keep them separate.
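For a rough sense of why 25 active sessions push a 4 GB box, here's a back-of-envelope capacity check. The per-session and OS-overhead numbers are illustrative assumptions only; measure real usage with Task Manager or perfmon before buying hardware:

```python
# Rough TS capacity check: RAM is usually the first ceiling on a 32-bit box.
ram_gb_total = 4.0    # current box: 4 GB, 32-bit Standard Edition cap
ram_gb_os = 0.75      # assumed OS + services overhead (AV, DC role, etc.)
mb_per_session = 120  # assumed average session footprint -- measure your own!

sessions = (ram_gb_total - ram_gb_os) * 1024 / mb_per_session
print(f"~{sessions:.0f} concurrent sessions before the box starts paging")
```

Under these assumptions the box tops out in the high twenties, which lines up with 25 busy accountants feeling slow; a 64-bit server with more RAM moves the ceiling well past 40.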
 

Kenazo

Lifer
Sep 15, 2000
10,429
1
81
What exactly does the domain controller do that necessitates separating its function to another computer? Is being the DC a processing intensive task?

By the way, I realize that this stuff is out of my league, but I'm just trying to get an idea of where we should be going before I meet with the IT people at our local store. They know what they're doing, but I want to know what they're doing. :)
 

RebateMonger

Elite Member
Dec 24, 2005
11,586
0
0
What does your current CPU and disk utilization look like for those three machines?

Can you switch to a Windows-based database?
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Kenazo
What exactly does the domain controller do that necessitates separating its function to another computer? Is being the DC a processing intensive task?

by the way, I realize that this stuff is out of my league, but I'm just trying to get an idea of where we should be going before I meet with the IT people at our local store. They know what they're doing, but I want to know what they're doing. :)

Domain controllers handle logons and interact with Lighthouse or similar programs. They interact with quite a few software packages to provide single sign-on, etc. Generally they are busy during the morning logon rush, but if you have programs that hit them beyond that, they will always be busy.

AD interacts with:

Exchange
Lighthouse
logon scripts
Some printer apps
QIP or some other IP management software etc...

Why would you want data on a DC to begin with? It is not good practice: if you lose that box, you lose your data and a DC at the same time.
 

wlee

Senior member
Oct 10, 1999
585
0
71
Yes, you're correct. 32-bit Standard Edition 2k3 only supports 4GB; 2k3 Enterprise supports more. You do realize that in the case of Server 2k3 SBS, it acts as BOTH DC and file server in the same box, right? A local memory/disk backbone is always faster than network speed. (Yeah, I know, SBS doesn't allow TS.) I will agree that, where possible, it's ALWAYS better to have a dedicated DC.

Did you read his original post? All of his workstations, except 2, are being used as thin clients only. There is no need to waste the overhead of having any of them in the domain at all. I know there are those admins that will INSIST that any machine on their network either join the domain or get the frak out, but it's NOT necessary. If those 2 stand-alone stations are using roaming profiles, I could see some serious overhead generated by them, but if it's just share permissions, etc., then after login, all that should be cached. I'm also *ASS*uming in this case that they are using IMAP/MAPI for email, so there would be no MASSIVE .pst file to bang at every domain (roaming) login.


Anyway, I think in his case, *IF* he has the budget, he should get a new, more powerful 64-bit server with as much RAM as possible/affordable (again, assuming his apps work on 64-bit), migrate the TS accounts to that, and re-task the old box as a PDC. Junk the old IBM server completely, as the leaking caps make it an unreliable time bomb.
 

RebateMonger

Elite Member
Dec 24, 2005
11,586
0
0
Yeah. I'd do one of two things:

Option 1
1) Buy a new server with multi-core CPU and lots of memory, buy a copy of Windows Server 2003 x64 or Server 2008 x64 (depending on whether you want the TS options of 2003 and whether you want to pay for some new Client licenses) and make it into a Terminal Server.

2) Migrate the data server onto the existing Xeon server and make it the Domain Controller. To do this, you'd want to install your current copy of Windows Server 2003 (32-bit) on it, and you will need to migrate to a Windows database program.

Option 2
1) Buy a new server as above. Buy a copy of Windows Server 2008 x64.

2) Install Server 2008 x64 with the Hyper-V role.

3) In two virtual machines, install the old Server 2003 (32-bit) and make it the database server and DC, and install Server 2008 x64 (the second, virtual, license that comes with the new Server 2008 x64 purchase) and make it a Terminal Server.

The second option gives you one less physical server and gives you a "direct" connection (using the Hyper-V "synthetic" NICs) between TS and the database (likely using different physical hard drive arrays for each virtual machine).

Depending on your messaging traffic, you MIGHT be able to put Exchange 2007 on the same Hyper-V box. You can always try virtualizing it and see how it works for you. If you did that, you could create a separate physical DC using the existing Dual Xeon server and could make the Data Server a second DC.

Or you could continue with a single virtual DC and only have a single physical server.