Hello, this is my first post here at the hallowed AT forums.
I have been reading through for years and finally decided on posting a question.
I will try to provide more than enough information on the specifics of my problem to help figure this out.
Okay, I live in an apartment, my friend MONKEY is in the apartment next to mine across the shared stairs, and below him is my business partner, with whom I work.
We all use his free internet (he works at Time Warner). I purchased the very nice D-Link 4300 router because we already had a gigabit network set up and were having problems with our Azureus setup killing the internet.
All three apartments are networked together with Cat5e, all on gigabit equipment.
I have the server (in my closet, of course) that we all use for file sharing. We use it quite heavily, as my partner and I are full-time self-employed PC technicians doing data backups, moving files around, etc.
I have upgraded the server multiple times hoping for faster, quieter, stronger, bigger, as I am a perfectionist.
I built this server on the very good Foxconn GeForce 6150-based motherboard that every machine we have is based on as well. We all use the integrated gigabit Ethernet and various PCIe NVIDIA video cards (no one is using onboard video except the server). I have four Seagate 300 GB SATA drives set to RAID 0+1 using the Foxconn's built-in NVIDIA RAID controller. When I copy data over via Windows file/print sharing, let's say a 4 GB DVD image, Task Manager on both the server and my personal machine shows roughly 40-50 megabytes per second in either direction.
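If anyone wants to double-check numbers like that without eyeballing Task Manager's graph, a timed copy gives a cleaner figure. A minimal Python sketch (the paths are placeholders, point them at any large file on the share):

```python
import os
import shutil
import time

# Hypothetical paths -- substitute a real UNC share and local destination.
SRC = r"\\server\share\test.iso"
DST = r"C:\temp\test.iso"

start = time.time()
shutil.copyfile(SRC, DST)          # goes over the same SMB path Explorer uses
elapsed = time.time() - start

size_mb = os.path.getsize(DST) / (1024 * 1024)
print(f"{size_mb:.0f} MB in {elapsed:.1f} s = {size_mb / elapsed:.1f} MB/s")
```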
Now, the problem I am having is this: I have a Counter-Strike: Source server set up on that box as well. When some friends visit for a LAN party, that guy who doesn't want to play games (everyone knows the type, because he sucks at games) starts copying files to or from the server. All of a sudden our game lags out and pings go ridiculous.
I check the server's CPU usage: without file sharing, the Source dedicated server uses around 50-60 percent (we have around 50 bots playing human vs. zombie, very fun). File sharing then adds CPU load in basically a 1:1 ratio with the utilization percentage of the server's network adapter.
WHY!!!!!
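To show what I mean by the 1:1 ratio, here is roughly how you could log CPU against NIC throughput once a second while a copy runs. A minimal sketch using the third-party psutil library (not a tool I actually ran, just to illustrate the measurement):

```python
import time
import psutil  # third-party: pip install psutil

psutil.cpu_percent(None)           # prime the counter; the first reading is meaningless
prev = psutil.net_io_counters()

while True:
    time.sleep(1)
    cur = psutil.net_io_counters()
    tx = (cur.bytes_sent - prev.bytes_sent) / (1024 * 1024)
    rx = (cur.bytes_recv - prev.bytes_recv) / (1024 * 1024)
    print(f"cpu {psutil.cpu_percent(None):5.1f}%  tx {tx:6.1f} MB/s  rx {rx:6.1f} MB/s")
    prev = cur
```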
I read somewhere that the nForce4 chipset has a thing called "IP offload" which should move network processing off the CPU, but then I began to wonder if Windows file sharing in general is just a CPU hog.
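One experiment that might separate the two: blast raw TCP data between two boxes and watch CPU. If raw sockets drive the NIC at the same rate with much lower CPU, the offload is doing its job and SMB itself is the hog; if CPU climbs the same way, it's the network stack or NIC. A minimal iperf-style sketch (the port number is arbitrary):

```python
import socket
import time

PORT = 5001                        # arbitrary test port
CHUNK = b"\x00" * 65536

def sink():
    """Run on the server: accept one connection and discard everything."""
    srv = socket.socket()
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    while conn.recv(65536):
        pass

def blast(host, seconds=10):
    """Run on a client: stream zeros as fast as TCP will take them."""
    s = socket.create_connection((host, PORT))
    sent, deadline = 0, time.time() + seconds
    while time.time() < deadline:
        s.sendall(CHUNK)
        sent += len(CHUNK)
    s.close()
    print(f"{sent / (1024 * 1024) / seconds:.1f} MB/s raw TCP")
```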
I currently have an Athlon 64 3000+ (Venice core) in the server and have also tried an Athlon 64 3500+ (ClawHammer core), with no difference whatsoever.
I am wondering if I should just build a separate server to handle the other tasks besides file sharing, or perhaps just bite the bullet and buy a TeraStation?
I also tried the same test on several machines, all with the same results. Even a 10/100 network card at 80 percent utilization would peg the CPU at 80 percent as well.
