Go Back   AnandTech Forums > Hardware and Technology > CPUs and Overclocking

Old 02-21-2013, 03:56 PM   #776
sefsefsefsef
Member
 
Join Date: Jun 2007
Posts: 195
Default

Why can't RTSes be multithreaded easily? What part doesn't scale out to many cores?

Anyway, the announced PS4 specs are the best case I could have hoped for. It is now MS's game to lose.
Old 02-21-2013, 04:52 PM   #777
Cerb
Elite Member
 
Cerb's Avatar
 
Join Date: Aug 2000
Posts: 15,232
Default

Quote:
Originally Posted by sefsefsefsef View Post
Why can't RTSes be multithreaded easily? What part doesn't scale out to many cores?
The Intel and nVidia money would be my guess.

I would think that could be built in without too much extra work. 3D RTSes have been around long enough that I have serious doubts actually keeping up with unit activity is a major bottleneck on modern computers. My guess is that they looked at user surveys, saw very few 4C (or even 2C4T) users, and figured it wasn't worth it for SCII, compared to eking out a little more performance from 2 cores. They can only throw so many millions of dollars at it.
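For what it's worth, the kind of per-unit update being discussed can be sketched in a few lines. This is a toy illustration, not code from any real engine (the units-as-tuples representation and the `tick` helper are made up): if each unit's next state depends only on last tick's state, the per-unit work is independent and can be farmed out across cores.

```python
# Toy sketch of a data-parallel RTS simulation tick: per-unit updates
# read only last tick's state, so they can be mapped over a worker pool.
from concurrent.futures import ThreadPoolExecutor

def update_unit(unit):
    # (x, y, vx, vy) -> advance position by one tick of velocity.
    x, y, vx, vy = unit
    return (x + vx, y + vy, vx, vy)

def tick(units, workers=4):
    # Gather every unit's independently computed next-tick state.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_unit, units))

units = [(0.0, 0.0, 1.0, 2.0), (5.0, 5.0, -1.0, 0.0)]
print(tick(units))  # [(1.0, 2.0, 1.0, 2.0), (4.0, 5.0, -1.0, 0.0)]
```

The hard part in a real RTS is everything this sketch leaves out: targeting, pathing, and collision all couple units to each other, which is where the synchronization cost comes from.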

Quote:
Anyway, the announced PS4 specs are the best case I could have hoped for. It is now MS's game to lose.
I wouldn't go that far, but if the cores aren't too slow, it isn't starved for CPU cache, and/or it has some special local store(s) (such as eDRAM), it ought to be pretty fair.
__________________
"The computer can't tell you the emotional story. It can give you the exact mathematical design, but what's missing is the eyebrows." - Frank Zappa
Old 02-21-2013, 05:01 PM   #778
NUSNA_Moebius
Golden Member
 
NUSNA_Moebius's Avatar
 
Join Date: Oct 2010
Location: USA
Posts: 1,300
Default

I think part of the 8-core Jaguar reasoning comes from developers being so used to heavily multithreading current console titles, and it's likely the background processes planned for the PS4 (downloading while gaming, recording, etc.) will be given an entire core or two to keep everything nice and smooth. It's a small, power-efficient core with a decent amount of processing capability and flexibility. Background tasks are also the reason they went with 8 GB of GDDR5 instead of 4 GB, I'm guessing. 4 GB would've been just about perfect for a standard gaming console, but the background processes are going to demand quite a bit of RAM. However, I must say, even though it's unified, the theorized 176 GB/s seems more than necessary for a console, even accounting for simultaneous use by the CPU, GPU, and perhaps dedicated compression and decompression units. It might be prudent and cost-effective to move down to a 192-bit bus with 6 GB, which would still be about 132 GB/s.
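Those bandwidth figures fall straight out of bus width times per-pin data rate. A quick sanity check, assuming the commonly cited 5.5 Gbps GDDR5:

```python
# Peak GDDR5 bandwidth in GB/s = (bus width in bits / 8) * per-pin data
# rate in Gbps. The 5.5 Gbps figure is the commonly cited assumption.
def peak_bandwidth_gbs(bus_bits, gbps_per_pin=5.5):
    return bus_bits / 8 * gbps_per_pin

print(peak_bandwidth_gbs(256))  # 176.0 -> the announced 256-bit figure
print(peak_bandwidth_gbs(192))  # 132.0 -> the 192-bit / 6 GB alternative
```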

Right now the PS4 is looking to be an expensive machine, by GPU and RAM alone. Supposedly it's all on one APU die, which certainly helps. Sony can't sell another $600 console, and loss-leader hardware is not a prudent business model; $400 is the max I would pay for a new console. If the PS4 fails, it would likely bring the whole of Sony down with it. With all the social BS planned for it (and the Nextbox will likely be just as social-focused), I'm somewhat hoping it fails.
__________________
MY YOUTUBE CHANNEL! - Current muses: Ghost Recon: Phantoms, War Thunder, Firefall

Geezer computer - Phenom II x4 @ 2.8 GHz | Diamond ATi Radeon 5850 1 GB | 2 x 2 GB DDR3-1600 | MSI 770-C45 Mobo | Asus Xonar DS Sound card | Win7 64 | 1 TB HDD | LG 32" HDTV

Last edited by NUSNA_Moebius; 02-21-2013 at 05:07 PM.
Old 02-21-2013, 05:13 PM   #779
cplusplus
Member
 
Join Date: Apr 2005
Posts: 91
Default

Quote:
Originally Posted by Lonbjerg View Post
So that is why Carmack, EPIC and Crytek all state that the closed, long loop of console hardware is inferior to the PC?

You just witnessed the stagnation being set for the next 5-6-7 years...
Given the same hardware, a closed system will perform better than an open system on that hardware, because there is (or can be) less abstraction/translation of the instructions. It's like how efficient code written in assembly will run faster than efficient code doing the same thing in C. On the PS4 specifically, it means writing code against libgcm as opposed to OpenGL.
Old 02-21-2013, 05:20 PM   #780
Lonbjerg
Banned
 
Join Date: Dec 2009
Location: Denmark
Posts: 4,426
Default

RTS can be multicore?
http://au.gamespot.com/features/6166198/p-6.html

http://www.youtube.com/watch?v=UyeylvL2-j8
Old 02-21-2013, 06:17 PM   #781
Skurge
Diamond Member
 
Skurge's Avatar
 
Join Date: Aug 2009
Location: Namibia
Posts: 4,927
Default

Quote:
Originally Posted by Lonbjerg View Post
Yep, I was just about to mention that. No reason why an RTS can't be well threaded.
__________________
Intel Core i5-4670K |MSI Z97-Gaming 5|32GB DDR3-1600|Gigabyte R9 290 Windforce CF [stock]|Samsung SSD 840 Evo 500GB|Corsair AX860 PSU|Corsair 750D|Windows 8.1 Pro|Samsung U28D590D|Logitech G27 Racing Wheel|Nexus 5 32GB
Old 02-21-2013, 06:20 PM   #782
Cerb
Elite Member
 
Cerb's Avatar
 
Join Date: Aug 2000
Posts: 15,232
Default

Quote:
Originally Posted by NUSNA_Moebius View Post
I think part of the 8 core Jaguar reasoning comes from developers being so used to heavily multithreading current console titles and that it's likely the background processes planned for the PS4 (downloading while gaming, recording, etc) will likely be given either an entire core or two to keep everything nice and smooth. It's a small, power efficient core, with a decent amount of processing capability and flexibility. Background tasks are also the same reason why they went with 8 GB of GDDR5 instead of 4 GB I'm guessing. 4 GB would've been just about perfect for standard gaming console
I doubt it. 4GB would make it barely as capable as a PC from a few years ago. RAM is historically cheap, so skimping out on it would be far worse than skimping out on the CPU power, or GPU bandwidth.

Take Skyrim on the XB360, for instance. It's a total joke. On a nice PC, however (not even a really expensive one), it plays fine and eats up that RAM. It may not need 8GB, but it'll use 4-6GB all day long, and fixed HW with a minimal OS is only going to gain you a few hundred MBs. Give it 4GB, or a 1GB video card (like mine), and it gets choppy; make it fuzzy, and the choppiness goes away.

I don't doubt there will be background tasks, but none that take up too much RAM, and the processor-heavy ones will surely have dedicated acceleration hardware (video-encoding DSPs, for instance, are small, low-power, and high-bandwidth).

The last gen of consoles had too little RAM. Out of the gate we already had them beat by miles, and the gap only widened. With a shared 8GB of high-bandwidth RAM, they will be able to use arbitrarily higher-res textures where it can make a difference, rather than having to skimp from day one. It gives them rough equivalence to a video card with 2-4GB of VRAM. They might actually be able to make games that look decent for a few years, and degrade gracefully beyond that (non-integer upscaling has always looked bad, blurry shader AA has always looked bad, bloom has always looked bad, textures have stayed too fuzzy, etc.). Much like with CPUs, the amount of RAM has started to matter more than its bandwidth, and GPUs have efficient caches, so IMO it makes sense purely from a gaming standpoint. Unless you're one of those people who doesn't think bloom-laden, upscaled, fuzzy scenes with uneven framerates look like crap.
__________________
"The computer can't tell you the emotional story. It can give you the exact mathematical design, but what's missing is the eyebrows." - Frank Zappa
Old 02-21-2013, 06:57 PM   #783
podspi
Golden Member
 
podspi's Avatar
 
Join Date: Jan 2011
Location: USA
Posts: 1,700
Default

Correct me if I'm wrong, but won't Jaguar have around Core 2 levels of IPC? Octa-core Core 2 (even at 1.6ghz) sounds impressive enough to me. Is there any word on turbo ability? Given the fact that AMD's turbo is deterministic, it should be relatively safe to enable...

Either way, it is going to embarrass the Wii U...
Old 02-21-2013, 07:04 PM   #784
MrPickins
Diamond Member
 
MrPickins's Avatar
 
Join Date: May 2003
Location: Austin, TX
Posts: 7,352
Default

Quote:
Originally Posted by Dresdenboy View Post
With a smart OS scheduler there is no need for such core pinning. Also I wouldn't consider a system with an OS and background tasks (virus scanners in a closed environment?) needing multiple 1.x GHz cores efficient.
For MS, it may not make as much sense (assuming they're using a Windows variant), but I'm not sure Sony would put forth the effort on the scheduler when pinning it to one core would be so easy and cost efficient from an R&D standpoint. That, and it reminds me of how they reserved one SPE in the PS3 for the OS.

Additionally, I would think that reserving a fixed number of cores would make for a more consistent platform for which to code. You wouldn't have to worry about your threads being preempted by OS threads. You can guarantee that your allotted cpu resources have high availability.

Of course, this all hinges on my belief that both manufacturers plan on having many simultaneous background tasks running for a large percentage of play time (streaming shared video, downloading, voice chat, Kinect/Move, etc).
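For illustration, core reservation of the kind described above looks roughly like this on Linux. This is a hypothetical sketch (the `reserve_cores` helper is made up, and a console OS would do the equivalent internally), not Sony or MS code:

```python
# Illustrative only: pinning a process to a subset of cores on Linux.
import os

def reserve_cores(game_cores):
    # Restrict the calling process to game_cores; any core not in the
    # set stays free for OS/background work (the split discussed above).
    os.sched_setaffinity(0, game_cores)
    return sorted(os.sched_getaffinity(0))

if hasattr(os, "sched_setaffinity"):  # Linux-only API
    saved = os.sched_getaffinity(0)
    print(reserve_cores({0}))       # [0]: pinned to core 0 only
    os.sched_setaffinity(0, saved)  # undo, since this is just a demo
```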
__________________
Most Likely :
Last edited by MrPickins; Some Day at Some time.
Old 02-21-2013, 07:11 PM   #785
MrPickins
Diamond Member
 
MrPickins's Avatar
 
Join Date: May 2003
Location: Austin, TX
Posts: 7,352
Default

Quote:
Originally Posted by Cerb View Post
...

Take Skyrim on the XB360, for instance. It's a total joke. On a nice PC, however (not even a really expensive one), it plays fine and eats up that RAM. It may not need 8GB, but it'll use 4-6GB all day long, and fixed HW with a minimal OS is only going to gain you a few hundred MBs. Give it 4GB, or a 1GB video card (like mine), and it gets choppy; make it fuzzy, and the choppiness goes away.
...
Using high res texture pack mods, I assume?

I never see that much system RAM used when playing unmodded Skyrim.
__________________
Most Likely :
Last edited by MrPickins; Some Day at Some time.
Old 02-21-2013, 08:30 PM   #786
Olikan
Golden Member
 
Olikan's Avatar
 
Join Date: Sep 2011
Posts: 1,825
Default

Quote:
Originally Posted by podspi View Post
Correct me if I'm wrong, but won't Jaguar have around Core 2 levels of IPC? Octa-core Core 2 (even at 1.6ghz) sounds impressive enough to me. Is there any word on turbo ability? Given the fact that AMD's turbo is deterministic, it should be relatively safe to enable...

Either way, it is going to embarrass the Wii U...
Nope... it ends up somewhere between K8 and K10, just a bit below Bulldozer's IPC.

I didn't find anything about turbo, but turbo is a waste if developers can easily use all the resources.
__________________
Quote:
I must be dyslexic, because every time I look at your name I see OilKan!
Old 02-21-2013, 08:59 PM   #787
Cerb
Elite Member
 
Cerb's Avatar
 
Join Date: Aug 2000
Posts: 15,232
Default

Quote:
Originally Posted by MrPickins View Post
Using high res texture pack mods, I assume?

I never see that much system RAM used when playing unmodded Skyrim.
Well, it's not a technical issue in itself, but there's also that: stock Bethesda games miss the point of playing Bethesda games.
__________________
"The computer can't tell you the emotional story. It can give you the exact mathematical design, but what's missing is the eyebrows." - Frank Zappa
Old 02-21-2013, 09:23 PM   #788
lopri
Elite Member
 
lopri's Avatar
 
Join Date: Jul 2002
Posts: 9,765
Default

Quote:
Originally Posted by Lonbjerg View Post
Anything can be multi-threaded. I thought we were talking about degrees. I will reiterate: RTS isn't as suitable for multi-threading as other genres of games are, vastly so. I don't feel like explaining why because I thought it was a well-discussed topic (hint: human interaction). Every other game can be multi-threaded beautifully, and I believe that's the way to go moving forward.

As for that Supreme Commander BS, that's what it is: BS. At the time I downloaded the demo, bought the game (didn't even like it), and applied the "performance-enhancement" patch, only to learn the whole propaganda was BS. And I kept some screenshots; I dug them out, and here is one. Try it yourselves; you should be able to find the game cheap in a bargain bin somewhere.



Spec: Q6600 OC'ed + 8800 GTX.
Old 02-21-2013, 09:25 PM   #789
itsmydamnation
Senior Member
 
Join Date: Feb 2011
Posts: 613
Default

Quote:
Originally Posted by Olikan View Post
Nope... it ends up somewhere between K8 and K10, just a bit below Bulldozer's IPC.

I didn't find anything about turbo, but turbo is a waste if developers can easily use all the resources.
so where are these benchmarks of jaguar?
Old 02-21-2013, 09:58 PM   #790
Roland00Address
Golden Member
 
Join Date: Dec 2008
Posts: 1,261
Default

Quote:
Originally Posted by itsmydamnation View Post
so where are these benchmarks of jaguar?
Jaguar hasn't come out and it has not been previewed by any reliable source.

Right now any numbers are pure speculation based on AMD marketing slides (in other words the info is completely unreliable.)
Old 02-21-2013, 10:08 PM   #791
IntelUser2000
Elite Member
 
IntelUser2000's Avatar
 
Join Date: Oct 2003
Posts: 3,494
Default

Quote:
Originally Posted by podspi View Post
Correct me if I'm wrong, but won't Jaguar have around Core 2 levels of IPC? Octa-core Core 2 (even at 1.6ghz) sounds impressive enough to me. Is there any word on turbo ability? Given the fact that AMD's turbo is deterministic, it should be relatively safe to enable...

Either way, it is going to embarrass the Wii U...
At 1.4GHz, it gets 1.4 in Cinebench R11.5.

In comparison, the Core 2 Quad Q9000 running at 2GHz gets 2.4 in that same benchmark.

I'd say it's now maybe 5% above the original Athlon 64 in terms of "IPC".
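Turning those Cinebench numbers into a rough per-clock comparison (assuming both results come from quad-core parts; the 1.4 GHz Jaguar figure is presumed to be a 4-core chip, and the Q9000 is a 2.0 GHz Penryn quad):

```python
# Rough per-clock comparison from the Cinebench R11.5 scores quoted above.
def per_core_per_ghz(score, cores, ghz):
    return score / (cores * ghz)

jaguar = per_core_per_ghz(1.4, 4, 1.4)  # ~0.25 points per core per GHz
penryn = per_core_per_ghz(2.4, 4, 2.0)  # ~0.30 points per core per GHz
print(round(jaguar / penryn, 2))        # 0.83 -> ~83% of Penryn per clock
```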
__________________
Core i7 2600K + Turbo Boost | Intel DH67BL/GMA HD 3000 IGP | Corsair XMS3 2x2GB DDR3-1600 @ 1333 9-9-9-24 |
Intel X25-M G1 80GB + Seagate 160GB 7200RPM | OCZ Modstream 450W | Samsung Syncmaster 931c | Windows 7 Home Premium 64-bit | Microsoft Sidewinder Mouse | Viliv S5-Atom Z520 WinXP UMPC
Old 02-21-2013, 11:02 PM   #792
itsmydamnation
Senior Member
 
Join Date: Feb 2011
Posts: 613
Default

Quote:
Originally Posted by IntelUser2000 View Post
At 1.4GHz, it gets 1.4 in Cinebench R11.5.

In comparison, the Core 2 Quad Q9000 running at 2GHz gets 2.4 in that same benchmark.

I'd say it's now maybe 5% above the original Athlon 64 in terms of "IPC".
again links to benchmarks please?
Old 02-21-2013, 11:02 PM   #793
podspi
Golden Member
 
podspi's Avatar
 
Join Date: Jan 2011
Location: USA
Posts: 1,700
Default

Quote:
Originally Posted by IntelUser2000 View Post
At 1.4GHz, it gets 1.4 in Cinebench R11.5.

In comparison, the Core 2 Quad Q9000 running at 2GHz gets 2.4 in that same benchmark.

I'd say it's now maybe 5% above the original Athlon 64 in terms of "IPC".
Are you referring to Bobcat or Jaguar? If Bobcat, after adjusting for clock speed and projected IPC gain (heh) that puts it around Core 2 IPC (though still a bit slower than Core 2).

I still stand by my statement, I think the CPU will be fine. Obviously if you are a hardcore PC gamer these specs will disappoint, but for most console gamers this is a huge step up. Hopefully the machines will be both cheaper and have shorter lifecycles. Moving forward, backwards compatibility should be much easier if they stick with x86.
Old 02-21-2013, 11:45 PM   #794
IntelUser2000
Elite Member
 
IntelUser2000's Avatar
 
Join Date: Oct 2003
Posts: 3,494
Default

No, I mean Jaguar. Bobcat was a decent amount behind the Athlon 64, let alone Core 2.

http://www.planet3dnow.de/cgi-bin/ne...?id=1361486916

Based on the performance and power figures, I think if we can normalize things it'd have similar perf/watt to the Samsung Exynos "Octa".
__________________
Core i7 2600K + Turbo Boost | Intel DH67BL/GMA HD 3000 IGP | Corsair XMS3 2x2GB DDR3-1600 @ 1333 9-9-9-24 |
Intel X25-M G1 80GB + Seagate 160GB 7200RPM | OCZ Modstream 450W | Samsung Syncmaster 931c | Windows 7 Home Premium 64-bit | Microsoft Sidewinder Mouse | Viliv S5-Atom Z520 WinXP UMPC
Old 02-22-2013, 12:11 AM   #795
poohbear
Golden Member
 
Join Date: Mar 2003
Location: Toronto, Canada
Posts: 1,922
Default

Quote:
Originally Posted by ShintaiDK View Post
Lets just call it a HD7850 since its within 5% of it. And its far far away from a HD7870.
That's not true at all. It might have the same "GFLOPS", but that does NOT mean it translates to the PC-equivalent video card. Crysis 3 is rendered on an 8-year-old PS3; I don't know of any 8-year-old PC video card that can play Crysis 3 with the same amount of detail as the PS3. These GPUs are in a console using software optimized strictly for gaming, so the results are very different and far superior to the PC video card "equivalent" in GFLOPS. If anything, it would probably perform like a 7950/7970 card on a PC.

Software optimizations are a huge part of how a game can be rendered. Just look at BF3's improvement on the AMD 79xx series with driver updates: it got a 30-40% increase in performance with optimized drivers because the engineers focused solely on that game due to its popularity.
__________________
Desktop: 4790k | Noctua NH-D14 | 16GB (2x8gb) Crucial Ballistex @ CL9 | Asrock Z97 OC Formula | Gigabyte GTX 670 SLI | 250gb Samsung 840 Evo & 240gb OCZ Vertex 3 MI & 2TB WD Black | Auzentech Forte 7.1 | Seasonic 760wt Platinum | DELL U2711 @ 1440p | Corsair 300R | Win 8.1
Ultrabook: Zenbook UX32LN | i5 4200u | 8GB RAM | Nvidia 840m | IPS Matte @ 1080p | 256GB SSD | Win8.1

Last edited by poohbear; 02-22-2013 at 12:18 AM.
Old 02-22-2013, 01:49 AM   #796
2is
Platinum Member
 
Join Date: Apr 2012
Posts: 2,608
Default

HIGHLY unlikely it's anywhere near a 7970; optimizations only go so far. Let's not forget many games are rendered at very low resolutions and then upconverted. I'd say the sun going nova this year is about as likely as an APU matching 7970 (the most powerful GPU around until Titan is available) performance.
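As a rough check on where the announced specs land: peak single-precision throughput is shader count times 2 FLOPs (multiply-add) times clock. The shader counts and clocks below are the commonly cited figures, not official numbers:

```python
# Peak single-precision GFLOPS = shaders * 2 FLOPs (multiply-add) * GHz.
def gflops(shaders, ghz):
    return shaders * 2 * ghz

print(gflops(1152, 0.800))  # ~1843: PS4 GPU as announced (18 CUs)
print(gflops(1024, 0.860))  # ~1761: HD 7850
print(gflops(2048, 0.925))  # ~3789: HD 7970
```

On paper, the PS4 GPU sits right next to the 7850 and roughly half a 7970, which is the point being made above.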
__________________
Intel i7 3770K|240GB Intel SSD 520|Asus P8Z77-V Pro|2x GTX 680 SLI (2GB)|180GB Corsair Force SSD|Corsair TX750|2x8GB DDR3 1600 (1.35v)
Old 02-22-2013, 01:58 AM   #797
inf64
Platinum Member
 
inf64's Avatar
 
Join Date: Mar 2011
Posts: 2,045
Default

Quote:
Originally Posted by IntelUser2000 View Post
At 1.4GHz, it gets 1.4 in Cinebench R11.5.

In comparison, the Core 2 Quad Q9000 running at 2GHz gets 2.4 in that same benchmark.

I'd say it's now maybe 5% above the original Athlon 64 in terms of "IPC".
5% in integer tasks (remember, K10 was 10-15% faster than K8 in the same scenario). In SSE/FP-intensive tasks Jaguar will be ~2x faster than Bobcat/K8, since it has 128-bit FP pipelines vs. 64-bit in the former two.
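The ~2x claim follows directly from issue width: a 128-bit packed-single SSE op covers four floats, and a 64-bit pipe has to crack it into two halves. A trivial sketch of the arithmetic:

```python
# Issue cost of one 128-bit packed-single SSE op on pipes of different
# widths: a 128-bit pipe takes one pass, a 64-bit pipe (Bobcat/K8)
# takes two, hence the ~2x figure for FP/SSE-heavy code.
def passes_per_sse_op(pipe_bits, op_bits=128):
    return op_bits // pipe_bits

print(passes_per_sse_op(128))  # 1 -> Jaguar
print(passes_per_sse_op(64))   # 2 -> Bobcat/K8
```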
__________________
ShintaiDK:"There will be no APU in PS4 and Xbox720."
ShintaiDK:"No quadchannel either.[in Kaveri]"
CHADBOGA:"Because he[OBR] is a great man."
Old 02-22-2013, 02:38 AM   #798
krumme
Platinum Member
 
Join Date: Oct 2009
Posts: 2,088
Default

Quote:
Originally Posted by inf64 View Post
5% in integer tasks(remember K10 was 10-15% faster in same scenario than K8 ). In SSE/fp intensive tasks Jaguar will be ~2x faster than Bobcat/K8 since it has 128bit fp pipelines vs 64bit in the case of former two.
How does the gaming workload compare for FP vs. integer relative to normal office use? I would guess it's more FP-heavy, but that's just an assumption.

The question is also: why put a 128-bit FP pipeline in Jaguar, since it's quite a shift from Bobcat? (The core is still 3.1 mm² sans L2, but isn't the FPU expensive in power terms too?)
Old 02-22-2013, 02:43 AM   #799
poohbear
Golden Member
 
Join Date: Mar 2003
Location: Toronto, Canada
Posts: 1,922
Default

Quote:
Originally Posted by 2is View Post
HIGHLY unlikely it's anywhere near a 7970, optimizations only go so far. Lets not forget many games are rendered at very low resolutions then up converted. I'd say the sun going nova this year as about as likely as an APU matching 7970 (most powerful GPU around until titan is available) performance.
If a PS3 that's 8 years old can play Crysis 3 and look that good, then yes, a 7970. In 8 years (2020), can a 7970 play a state-of-the-art game? Probably just barely, if at all; hence I'd say a PS4 is as powerful as a 7950/7970. Think about it: a GeForce 7900 GTX or a Radeon X1900 XTX was state of the art in 2006, but there's no way they can play Crysis 3 at the same level of detail as a PS3.
__________________
Desktop: 4790k | Noctua NH-D14 | 16GB (2x8gb) Crucial Ballistex @ CL9 | Asrock Z97 OC Formula | Gigabyte GTX 670 SLI | 250gb Samsung 840 Evo & 240gb OCZ Vertex 3 MI & 2TB WD Black | Auzentech Forte 7.1 | Seasonic 760wt Platinum | DELL U2711 @ 1440p | Corsair 300R | Win 8.1
Ultrabook: Zenbook UX32LN | i5 4200u | 8GB RAM | Nvidia 840m | IPS Matte @ 1080p | 256GB SSD | Win8.1

Last edited by poohbear; 02-22-2013 at 02:46 AM.
Old 02-22-2013, 02:45 AM   #800
BallaTheFeared
Diamond Member
 
BallaTheFeared's Avatar
 
Join Date: Nov 2010
Posts: 8,128
Default

Quote:
Originally Posted by krumme View Post
How does the gaming workload compare for FP vs. integer relative to normal office use? I would guess it's more FP-heavy, but that's just an assumption.

The question is also: why put a 128-bit FP pipeline in Jaguar, since it's quite a shift from Bobcat? (The core is still 3.1 mm² sans L2, but isn't the FPU expensive in power terms too?)
PC gaming is pretty much integer-dependent; that's why AVX2 is much more important to our segment than AVX could ever be (which was worthless).

I wonder if the cores in these consoles will support it; that would really help drive code implementation and design for PCs.
Powered by vBulletin® Version 3.8.7
Copyright ©2000 - 2014, vBulletin Solutions, Inc.