What GPGPU applications are available to ATI users?


Scali

Banned
Why nVidia? ATI hardware is about 50% more powerful (in terms of FLOPS), but nVidia provides CUDA, which allows native C code, Fortran code, and recently even C++ code to be compiled for the GPU. With ATI you have to learn ATI's own programming language and rewrite your program for it, which is significantly more difficult than using CUDA.

Another factor is that ATi's architecture is less efficient than nVidia's.
In other words, even though on paper ATi may have the GFLOPS advantage, it may not translate into a faster GPGPU application, because it's easier to extract the GFLOPS from the nVidia architecture in practice.
A nice example of that is Folding@home... nVidia GPUs run circles around ATi GPUs, because they can just use the GPU more efficiently (because of the way nVidia implemented its shared cache).
GFLOPS ratings really don't mean anything, since they give no indication of efficiency (not that a rating really could, because efficiency can vary greatly from one algorithm to the next).
We've seen the same even in graphics... the GTX 280 had a lower GFLOPS rating than the 4870, but it was the faster videocard in pretty much every game, because it was more efficient. In GPGPU things only get worse, since most algorithms are not as straightforward and embarrassingly parallel as triangle rendering.
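To make the 'native C code' point concrete, here is a minimal sketch of what a CUDA kernel and its launch look like - a generic SAXPY example with made-up names, assuming the CUDA toolkit (nvcc) and a CUDA-capable card, not code from any poster:

```cuda
#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Ordinary C arithmetic, executed once per thread on the GPU.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Plain C host code: fill the input arrays.
    float *hx = (float *)malloc(bytes), *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy to the GPU, launch a grid of 256-thread blocks, copy back.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);

    printf("y[0] = %f\n", hy[0]);   // expect 4.0
    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}
```

The kernel body is essentially just C; that is the gap the quoted post is describing, since ATI's own Stream/Brook+ tooling of the time required the same computation to be expressed in a separate, vendor-specific language.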
 

taltamir

Lifer
That's what you might think, based on the overall market share... but it depends very much on the application.
For example, although Apache exists for both Windows and Linux, Apache is primarily used on Linux systems.
Likewise, PhotoShop used to be a typical Mac application.

Photoshop used to be primarily Mac because PowerPC was 64-bit much earlier than x86. However, nowadays x86 has 64-bit and is much, much faster. As a result, some of Adobe's newest tools are Windows-only and none are Mac-only. (And Macs nowadays use x86 anyway.)
 

Scali

Banned
Photoshop used to be primarily Mac because PowerPC was 64-bit much earlier than x86. However, nowadays x86 has 64-bit and is much, much faster. As a result, some of Adobe's newest tools are Windows-only and none are Mac-only. (And Macs nowadays use x86 anyway.)

No, PhotoShop used to be primarily Mac because it was developed on Mac, for Mac (this wasn't even in the PowerPC era yet, they still used 68k processors).
When Windows versions emerged later, there was a lot of criticism because Windows didn't have proper support for colour profiles. So users still preferred Macs, as Windows wasn't really an option for professional use.
But as I said, 'used to be'... as far as I know, the colour-profile problems in Windows have been solved and the hardware on both sides is equal, so I don't know whether the majority of users stuck with the Mac anyway or have since moved to Windows.
But that's not really the point... The point I was making is that the market share of platforms for a specific application may not correspond at all with the overall market share. PhotoShop is at the very least a historic example of that, even if it may no longer be the case today. But that isn't really relevant here.
 

Dribble

Platinum Member
I think ATI suffers a bit from being part of AMD as far as software is concerned. AMD has always done x86 - you don't have to develop software for that, it's a pretty fixed standard; all you need is support for the latest additions and a compiler that actually uses them (i.e. not Intel's). Hence AMD has been able to get away with a small software team.

This lack of priority on software carries over to ATI.

Graphics and GPUs are very different - they change much more rapidly, so if you want devs to use the features you've got to give them a lot more help. Nvidia seems to have *got* this and has a huge software dev team; AMD is still in denial - all this "we just produce the hardware, it's up to devs to write the software, not us" doesn't really work. Given the choice of having it all handed to you on a plate or having to write it all from first principles - well, that isn't really a choice, is it?
 

Scali

Banned
That's true, Dribble... but it goes further back than the AMD/ATi merger.
ATi never had a very strong software team either. Their OpenGL drivers in particular were notoriously bad compared to nVidia's. Driver quality was almost their downfall.

ATi realized this just in time, and started the new Catalyst driver program, with monthly updates.
They were also lucky that at this time Direct3D 9 came out, and became so dominant that OpenGL driver quality didn't really matter anymore. And Direct3D drivers have never had as many issues as OpenGL drivers.

So ATi never had a very strong reputation on the software side of things. What I don't get, though, is that AMD never addressed this... especially considering their plans for Fusion. Why didn't they just hire a bunch of high-profile software guys and tackle this GPGPU thing head-on? What AMD currently does just spreads their driver team too thin.

They made various silly mistakes as well...
For example, their OpenCL SDK covers both their CPUs and their GPUs. Now, I had a machine with an AMD CPU but an nVidia GPU. OpenCL couldn't run on the CPU, because the SDK had references to the Stream runtime, so the GPU drivers had to be installed on the system... which you can't do when you don't have a compatible AMD GPU in the machine. So I had to manually extract the GPU driver package and copy the relevant DLLs into the proper place.
Those are beginner's mistakes.
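For reference, finding and using the CPU device through OpenCL should only take a couple of API calls; the sketch below (plain C, illustrative only, assuming a properly installed OpenCL runtime and linking against -lOpenCL) shows the kind of platform/device query that failed on that machine until the Stream DLLs were in place:

```c
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    // Ask the OpenCL runtime which platforms are installed.
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);
    if (num_platforms > 8)
        num_platforms = 8;

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, NULL);

        // Ask each platform specifically for CPU devices.
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        cl_int err = clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_CPU,
                                    8, devices, &num_devices);

        printf("Platform '%s': %u CPU device(s)\n",
               name, (err == CL_SUCCESS) ? num_devices : 0);
    }
    return 0;
}
```

If the vendor's runtime DLLs aren't installed (or, as in the case above, can't be installed), that vendor's platform simply never shows up in the list, no matter what hardware is actually in the machine.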
 

taltamir

Lifer
No, PhotoShop used to be primarily Mac because it was developed on Mac, for Mac (this wasn't even in the PowerPC era yet, they still used 68k processors).
When Windows versions emerged later, there was a lot of criticism because Windows didn't have proper support for colour profiles. So users still preferred Macs, as Windows wasn't really an option for professional use.

Thanks for the correction.
 

T2k

Golden Member
I said I'm a developer.
Let me explain it to you: if we developers are hindered in developing and deploying OpenCL applications, you consumers aren't getting applications.
You think it's entirely a coincidence that a major company like Adobe chooses Cuda for GPGPU acceleration rather than OpenCL?



I don't believe in Fusion. A combination of a sub-par CPU and a sub-par GPU on a single die, how amazing!

Well, I don't believe you are a developer, especially since you apparently don't even understand what AMD said or what OpenCL is.
 

T2k

Golden Member
I think ATI suffers a bit from being part of AMD as far as software is concerned. AMD has always done x86 - you don't have to develop software for that, it's a pretty fixed standard; all you need is support for the latest additions and a compiler that actually uses them (i.e. not Intel's). Hence AMD has been able to get away with a small software team.

This lack of priority on software carries over to ATI.

Graphics and GPUs are very different - they change much more rapidly, so if you want devs to use the features you've got to give them a lot more help. Nvidia seems to have *got* this and has a huge software dev team; AMD is still in denial - all this "we just produce the hardware, it's up to devs to write the software, not us" doesn't really work. Given the choice of having it all handed to you on a plate or having to write it all from first principles - well, that isn't really a choice, is it?

AMD/ATI does not HAVE to do this, while Nvidia has no choice if it wants to survive - that's the point.

It's one thing if someone "does not believe in Fusion" - hey, there are people who believe the climate isn't changing either - but it's coming, and it completely cuts Nvidia out of this market because Nvidia has no license to anything.

Nvidia is forced to carve out its own niche market, otherwise CPU-GPU combos will destroy its market share.
 

Scali

Banned
Well, I don't believe you are a developer, especially since you apparently don't even understand what AMD said or what OpenCL is.

If anyone wants to verify my existence as a developer, they can easily google me and land on a number of open source projects, demos, presentations and forum discussions about programming. So I don't really care what you believe. You can be in denial for all I care...
 

Genx87

Lifer
AMD/ATI does not HAVE to do this, while Nvidia has no choice if it wants to survive - that's the point.

It's one thing if someone "does not believe in Fusion" - hey, there are people who believe the climate isn't changing either - but it's coming, and it completely cuts Nvidia out of this market because Nvidia has no license to anything.

Nvidia is forced to carve out its own niche market, otherwise CPU-GPU combos will destroy its market share.

The more I hear about Fusion, the less impressive it sounds. It sounds like a beefed-up integrated solution. It may take out Nvidia's sub-$100 market, but I don't suspect it will do much more than that. And I don't see it breaking Intel's stranglehold on the market either.

That said, I agree with you that Nvidia has to go into niche markets to survive, which is why GPGPU will be huge for them if they can get into the HPC market, and why they are focusing on the mobile markets as well.
 

T2k

Golden Member
If anyone wants to verify my existence as a developer, they can easily google me and land on a number of open source projects, demos, presentations and forum discussions about programming. So I don't really care what you believe.

I really don't care, to tell you the truth.

You can be in denial for all I care...

Speaking of denial: if anyone wants to verify what's coming in terms of CPU+GPU/Fusion, they can easily Google "Intel HD" or "AMD Fusion" and land on a number of official presentations, lengthy documentation, etc. about how the landscape is already changing and why Nvidia has absolutely nothing to enter this race with.
 

Scali

Banned
I really don't care, to tell you the truth.

No, you just wanted to take a personal dig because you can't handle the fact that we don't share the same opinion.

Speaking of denial: if anyone wants to verify what's coming in terms of CPU+GPU/Fusion, they can easily Google "Intel HD" or "AMD Fusion" and land on a number of official presentations, lengthy documentation, etc. about how the landscape is already changing and why Nvidia has absolutely nothing to enter this race with.

I'm not denying anything.
Just saying that I don't think Fusion (or Intel's variation on the theme) is going to make discrete videocards as we know them obsolete. I don't want to go into the reasons why integrated GPUs are at a disadvantage compared to discrete videocards again... that has already been discussed many times in other threads.

nVidia has Tegra, by the way, which could become interesting if devices like the iPad and Linux/Android-based netbooks and other portable/embedded devices continue to rise in popularity.
 

Voo

Golden Member
I really don't care, to tell you the truth.
As a fellow programmer, it's obvious to me that Scali knows what he's talking about and has programmed some stuff in CUDA/OpenCL... you? I don't think so.

Speaking of denial: if anyone wants to verify what's coming in terms of CPU+GPU/Fusion, they can easily Google "Intel HD" or "AMD Fusion" and land on a number of official presentations, lengthy documentation, etc. about how the landscape is already changing and why Nvidia has absolutely nothing to enter this race with.
Well, we'll see. AMD has obviously put a lot of money into it, and we'd better hope it turns out well if we want to have some competition in the CPU market a few years from now. It depends to a large extent on the software side of things, and it seems they've learnt their lessons from some less successful projects.
The concept itself is interesting and sounds like a good idea - but so did Itanium, and we know how that turned out. If the only thing they accomplish is taking Nvidia's (and ATI's) low-end GPU market, I think we can consider it a failure (though undoubtedly still extremely hurtful for Nvidia); if their more ambitious goals work out as planned, it will be an extremely interesting time...

Only time will tell.
 

GaiaHunter

Diamond Member
I think ATI suffers a bit from being part of AMD as far as software is concerned. AMD has always done x86 - you don't have to develop software for that, it's a pretty fixed standard; all you need is support for the latest additions and a compiler that actually uses them (i.e. not Intel's). Hence AMD has been able to get away with a small software team.

This lack of priority on software carries over to ATI.

Graphics and GPUs are very different - they change much more rapidly, so if you want devs to use the features you've got to give them a lot more help. Nvidia seems to have *got* this and has a huge software dev team; AMD is still in denial - all this "we just produce the hardware, it's up to devs to write the software, not us" doesn't really work. Given the choice of having it all handed to you on a plate or having to write it all from first principles - well, that isn't really a choice, is it?

I think you are overlooking a few factors:

- AMD was interested in merging with NVIDIA, but the negotiations fell through (over the CEO spot, if my memory serves me right) - that would have been the best outcome for AMD and probably quite good for NVIDIA;

- ATi, before being bought, was already far behind NVIDIA in developer relations and their drivers didn't have a good reputation - if anything, the ATi drivers have improved since;

- At the time, Intel launched Core 2 while AMD had the Barcelona debacle, with TLB bugs and problems reaching decent frequencies (imagine if the original Phenom had been more like Athlon II, which has pretty much the same per-clock performance and could easily reach 3.5-3.6 GHz);

- ATi itself was coming off problems with the X1800, which was late, and even though the X1900 was quite a good card performance-wise, coming late never helps. And the X1600 wasn't really an answer to the 7600 GT;

- Not only did ATi have a bad previous generation, but the first generation released after the buyout, the HD 2000 series, was quite bad and LATE again!

Now:

- AMD got some of the processor problems sorted out - Phenom II and Athlon II are much better than the original Phenom, even though they fall short of Intel's Core i series, and AMD even released an X6 on the 45nm process that seems to sit in the same power envelope as Deneb and overclocks to 4 GHz (AMD is still losing badly except at the lower end);

- AMD got rid of their fabs;

- On the GPU side, ATi shook up the "GPU world" with the 4000 series by bringing $500 performance to the $300 mark and $300 performance to under $200, and was actually the first to release a DX11 GPU;

- AMD is sampling the first APUs and has announced the Fusion Fund (which is very unusual for AMD, and we can only hope it means AMD is improving its developer relations).

Looking at all those failures and problems, it's amazing they are still around, and it isn't hard to see that AMD had to prioritize and solve the problems that plagued them.
 

T2k

Golden Member
As a fellow programmer, it's obvious to me that Scali knows what he's talking about and has programmed some stuff in CUDA/OpenCL... you? I don't think so.

Couldn't care less, as I said - I oversee an entire scientific-viz firm's technology dept, so I found this "I'm a fellow programmer" bit amusing (even if a bit childish)... FYI, we develop our own apps here (GPU-based renderers, Max and Fusion plugins, etc.), yet somehow I really don't think he gets what AMD is saying.

Well, we'll see. AMD has obviously put a lot of money into it, and we'd better hope it turns out well if we want to have some competition in the CPU market a few years from now. It depends to a large extent on the software side of things, and it seems they've learnt their lessons from some less successful projects.
It's not going to be an overnight switch and it certainly won't kill the discrete market overnight - but it will slowly and steadily eat it up (save for the highest-end cards). First it will engulf the mobile market, where Nvidia has little to nothing to show - they will literally have to lose money on every chip to convince system designers to add a Tegra/Optimus/whatever to a platform that already comes with a built-in DX11-level chip and full support from Intel or AMD. In the meantime it will slowly creep into desktops, especially business ones, and, heck, I could use another DX11-level chip in my machine, especially if it's guaranteed to work in tandem with my matching-branded discrete card.

The concept itself is interesting and sounds like a good idea - but so did Itanium, and we know how that turned out.
I don't think it shares anything with the Itanium story, sorry. Itanium only sounded positive until we got our first couple of machines - in short order we realized it was a completely botched execution, because Intel waaaay overestimated its ability to raise the clock as well as its influence on ISVs, not to mention its inability to turn out a well-optimized compiler.

OTOH, here we don't need new compilers for x86/x64 etc.; it's more like a natural symbiosis between two pieces of hardware - as a matter of fact, even The Great Leader JSH envisioned such a future around 2000 in a Wired interview, with 'the display is the computer' or some similarly cheesy slogan. :D

The problem is that NV's business practices have clearly turned everyone away by now; nobody will license anything meaningful to NV, so they have no choice: CUDA or die. :D

If the only thing they accomplish is taking Nvidia's (and ATI's) low-end GPU market, I think we can consider it a failure (though undoubtedly still extremely hurtful for Nvidia); if their more ambitious goals work out as planned, it will be an extremely interesting time...

Only time will tell.
Look, NV lost one-third of its entire revenue last year when the chipset business dried up - how much did you know about that? :)
Now imagine if their mobile business went down the toilet...

You need CRAZY cash flow for R&D, and without sufficient revenue... I don't know...
If they cannot convince the world to use CUDA, then the best I can imagine for NV is turning into another Matrox: ruling a rather small niche market almost alone while releasing some discrete upgrade parts from time to time.
 

Voo

Golden Member
OTOH, here we don't need new compilers for x86/x64 etc.; it's more like a natural symbiosis between two pieces of hardware - as a matter of fact, even The Great Leader JSH envisioned such a future around 2000 in a Wired interview, with 'the display is the computer' or some similarly cheesy slogan. :D
They still need to change the compilers so they take advantage of those features and so on - nothing to underestimate. But from what I see, they seem to be taking that (and working together with devs) really seriously this time, which makes one hopeful. "Natural symbiosis" is a nice marketing gag, but code that was written for a CPU almost certainly won't run well on a GPU without some tweaks - though I'm not sure how much they want the compiler to do.


Look, NV lost one-third of its entire revenue last year when the chipset business dried up - how much did you know about that? :)
Now imagine if their mobile business went down the toilet...
Yeah, sure it would hurt them (and in the long term that market will disappear - there's no reason to have discrete GPUs for that); after all, that's why they're focusing so much on Tegra and the HPC market. Though I think AMD's real concern is Intel - Nvidia can't really harm them - so I'm more interested in how Fusion turns out with respect to x86 code and CPUs.
 

Scali

Banned
They still need to change the compilers so they take advantage of those features and so on - nothing to underestimate. But from what I see, they seem to be taking that (and working together with devs) really seriously this time, which makes one hopeful. "Natural symbiosis" is a nice marketing gag, but code that was written for a CPU almost certainly won't run well on a GPU without some tweaks - though I'm not sure how much they want the compiler to do.

At least the first generation will be no different from a regular CPU and discrete GPU... The CPU runs x86 code, and the GPU can only be used via the driver, through one of the usual APIs (DirectX, OpenGL, OpenCL, Stream...).
The problem with integration is again related to market share...
The last time AMD tried to push their own instruction set, it didn't take off... 3DNow! was not a bad instruction set at all, but there just weren't enough CPUs on the market to get support from compilers and applications.
Likewise, Intel is now going to 'backport' some of the Larrabee extensions to x86, in future SSE/AVX extensions. AMD will probably not have much of a choice but to somehow make their CPUs/GPUs compatible with Intel's extensions. Either that, or stick to the separate driver/API model, where you cannot exploit the integration factor at all.
Because as soon as it becomes a choice between one vendor and the other, AMD will lose to Intel, just like they did with 3DNow!.

So that's the thing here... AMD has been talking about Fusion for years, and has coined new terms like APU and 'natural symbiosis'... But what are they actually going to DO? They never went into detail about either the hardware integration or the way they want to support Fusion with software, and how they expect developers to take advantage of it.
It all comes back to the software.
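To illustrate why vendor-specific extensions live or die by software support: an application (or its compiler-generated dispatch code) has to detect them at runtime and fall back when they're absent, roughly like this plain-C sketch (GCC/Clang's <cpuid.h> is assumed; the 3DNow! flag sits in the extended CPUID leaf, AVX in the standard one, and a real AVX check would also verify OS XSAVE support, which is omitted here):

```c
#include <stdio.h>
#include <cpuid.h>   // GCC/Clang helper for the x86 CPUID instruction

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    // Standard leaf 1: SSE2 is EDX bit 26, AVX is ECX bit 28.
    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("SSE2  : %s\n", (edx & (1u << 26)) ? "yes" : "no");
        printf("AVX   : %s\n", (ecx & (1u << 28)) ? "yes" : "no");
    }

    // Extended leaf 0x80000001: 3DNow! is EDX bit 31 (AMD-only).
    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        printf("3DNow!: %s\n", (edx & (1u << 31)) ? "yes" : "no");
    }
    return 0;
}
```

An extension that only one vendor ships ends up behind checks like these forever, and unless compilers and libraries bother to generate the fast path for it, the silicon just sits idle - which is the 3DNow! story in a nutshell.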
 

Voo

Golden Member
The last time AMD tried to push their own instruction set, it didn't take off... 3DNow! was not a bad instruction set at all, but there just weren't enough CPUs on the market to get support from compilers and applications.
Well, from what I can remember, they didn't really worry about the software side of things with 3DNow! and just assumed everything would work out somehow and they'd get good compiler support, whereas this time they seem to worry more about that part of the equation.

But yeah, 3DNow! is exactly what I had in mind when I said I hope they've learnt from past mistakes ;)
 

Janooo

Golden Member
At least the first generation will be no different from a regular CPU and discrete GPU... The CPU runs x86 code, and the GPU can only be used via the driver, through one of the usual APIs (DirectX, OpenGL, OpenCL, Stream...).
The problem with integration is again related to market share...
The last time AMD tried to push their own instruction set, it didn't take off... 3DNow! was not a bad instruction set at all, but there just weren't enough CPUs on the market to get support from compilers and applications.
Likewise, Intel is now going to 'backport' some of the Larrabee extensions to x86, in future SSE/AVX extensions. AMD will probably not have much of a choice but to somehow make their CPUs/GPUs compatible with Intel's extensions. Either that, or stick to the separate driver/API model, where you cannot exploit the integration factor at all.
Because as soon as it becomes a choice between one vendor and the other, AMD will lose to Intel, just like they did with 3DNow!.

So that's the thing here... AMD has been talking about Fusion for years, and has coined new terms like APU and 'natural symbiosis'... But what are they actually going to DO? They never went into detail about either the hardware integration or the way they want to support Fusion with software, and how they expect developers to take advantage of it.
It all comes back to the software.
Really, where did x64 come from?
 

Scali

Banned
Well, from what I can remember, they didn't really worry about the software side of things with 3DNow! and just assumed everything would work out somehow and they'd get good compiler support, whereas this time they seem to worry more about that part of the equation.

You say they 'worry more'. What makes you say that? What are they doing differently this time?
I mean, Intel has its own compiler. Even if Microsoft chose not to support an Intel extension (which will probably never happen), Intel has it covered.
AMD has no compiler of its own, and there's no way they can conjure a decent compiler out of thin air in time for Fusion, make it integrate seamlessly with Visual Studio, and convince all developers to use it.

But yeah, 3DNow! is exactly what I had in mind when I said I hope they've learnt from past mistakes ;)

Well, that's why I'm asking: what is AMD doing differently this time? I haven't seen any indication that they're doing anything differently at all... just counting on MS and OpenCL to cover their backs.
 

Scali

Banned
Really, where did x64 come from?

That's not their own instruction set (as in: unique to AMD); it's shared with the other x86 vendors.
It only worked because Intel supports it as well. Case in point: Microsoft didn't release an x64 version of Windows until Intel had hardware in place.
In the case of 3DNow!, it didn't take off because Intel ignored it, and Intel is going to ignore whatever AMD might cook up with Fusion as well.
 

Gikaseixas

Platinum Member
That's not their own instruction set (as in: unique to AMD); it's shared with the other x86 vendors.
It only worked because Intel supports it as well. Case in point: Microsoft didn't release an x64 version of Windows until Intel had hardware in place.
In the case of 3DNow!, it didn't take off because Intel ignored it, and Intel is going to ignore whatever AMD might cook up with Fusion as well.

Intel didn't have a choice; MS was pressuring them. Please give credit where it's due: AMD was the main driving force behind it, period.

You certainly have knowledge but you also tend to lean against AMD. Even an idiot can see it in this thread.

Back to the topic. AMD needs to do a better job or at least increase their development pace when it comes to GPGPU.
 

PingviN

Golden Member
With AMD not being in a very strong financial position (and it hasn't been for quite some time), it's no wonder they can't push their own standard. Rallying behind OpenCL is the only sensible thing to do.

Nvidia, on the other hand, has had a great couple of years and as such can spend more money on marketing, R&D and bringing its own standard (CUDA) to the table. However, with Nvidia losing its chipset market, not getting an x86 license and therefore missing out on CPU/GPU fusion, it will have to find another market. In this case: HPC.

I don't think CUDA has much of a broad future, though. You can't have code running only on specific products when the products in question don't dominate the market. If a developer writes an application that runs on 50% of the GPUs out there, the other 50% won't be able to run it at all. If you use a standard that any system can run, well, you get a lot more potential customers. And more customers mean more money.

When AMD gets back into the black, I'm sure we'll see more marketing and more focus on developer relations.
 

Dark Shroud

Golden Member
Intel won't be able to ignore AMD's APUs. That's why Intel has Sandy Bridge and is adding its Larrabee technology to its "HD" IGPs.

AMD's APUs will be used in laptops and could find a nice place in netbooks.

Either way, both Intel and AMD are pushing Nvidia almost completely out of this market.

As for GPU-accelerated PhysX, if you don't have an SLI setup or a dedicated PhysX card, it kills your frame rates.

Havok has a bigger market share than PhysX, and because Havok is owned by Intel you can bet it will never use CUDA for GPU acceleration. Yet ATI has a good relationship with Havok, so they'll be using either OpenCL or DirectCompute - whichever better benefits Intel's IGPs and the other "Larrabee" projects that are still active.

Adobe will be moving over to OpenCL by CS6, so that will be a blow to CUDA. Microsoft is also using its own Direct2D API for GPU hardware acceleration, not CUDA.

For every great move Nvidia has made, it has tripped many times over. GPU hardware acceleration is slowly taking off, but in mainstream consumer software it's not using CUDA.
 

Scali

Banned
Intel didn't have a choice; MS was pressuring them. Please give credit where it's due: AMD was the main driving force behind it, period.

That isn't the point I was making.
The point I'm making is that if AMD introduces an instruction set that Intel doesn't back, it is going to fail.
It isn't about credit. If you want, I can give credit to AMD for being the main driving force behind 3DNow! as well. Who cares about credit when the technology is a failure anyway?
And why do you demand that people give credit every time they mention a technology?
If you want to play that game, it's about time to give nVidia credit for putting GPGPU solidly on the map with Cuda, and any developer will agree that C for Cuda was the 'inspiration' for the OpenCL API design. In other words, if it weren't for Cuda, we wouldn't have OpenCL today.
Or how about giving Intel credit for inventing the microprocessor and defining all the x86 instruction sets and extensions on which AMD could base their x86-64? Happy with credit now?

You certainly have knowledge but you also tend to lean against AMD. Even an idiot can see it in this thread.

People tend to see what they want to see.
As I said many times before, my current videocard is actually an AMD Radeon. So am I against AMD? Not really. I am just critical of them, as I am of any company.
 