3

Pretty much exactly what it says in the title. I'm looking to buy a new computer because I'm tired of spending two hours waiting to edit a point in ArcView, and I was curious whether either ArcView or R uses GPU acceleration. I plan on getting an Intel i7-3770K processor (using its onboard Intel HD 4000 graphics) and a Samsung SSD to speed things up, along with a decent amount of RAM. But when I posed the question to knowledgeable computer users who don't use GIS or R, they asked about the GPU and whether either program uses GPU acceleration. As far as I can tell they don't, but I figured I would ask for a more knowledgeable second opinion.

Thanks in advance,

HeidelbergSlide
  • The [Task View: High-Performance and Parallel Computing with R](http://cran.r-project.org/web/views/HighPerformanceComputing.html) might be of interest. I believe at this point the processor and RAM are more important for using R than the GPUs, but that could change in the next years. – Roland Aug 25 '12 at 17:51
  • Ok, looks good to me. I'll read through that in a bit more detail but it looks like the current set up will suffice for the next few years. Can always add a graphics card later on. Thanks! – HeidelbergSlide Aug 25 '12 at 17:57

2 Answers

3

See this thread and the ArcGIS site. ArcGIS does not support GPU calculations (unlike Manifold), so a basic 3D card is fine. Extra RAM and an SSD are the obvious choices here.

In R, implementing or using parallel and GPU-supported computing is largely your own responsibility; not many out-of-the-box solutions exist yet (see the few packages under "Applications" in the HPC Task View already mentioned in the comments). Unless you plan to start some serious GPU or CPU parallel programming, there's no need to buy anything special.
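That said, CPU parallelism in R needs nothing special at all: the `parallel` package ships with base R (since 2.14) and works on any multi-core machine like the i7 mentioned above. A minimal sketch:

```r
# CPU parallelism with the built-in parallel package -- no GPU or extra
# hardware required, just a multi-core processor.
library(parallel)

cl <- makeCluster(2)                       # start 2 worker processes
res <- parLapply(cl, 1:10, function(x) x^2)  # square 1..10 across workers
stopCluster(cl)                            # always shut the workers down

unlist(res)
```

This is the kind of "out-of-the-box" parallelism the HPC Task View covers; for anything GPU-based you would still have to roll your own (see the second answer).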

ROLO
2

In R you can use the gputools package to compute things on the GPU, but you'll have to brew up your own statistical algorithms. Maybe in a few years GPUs will have been around long enough that common algorithms will have GPU-based implementations.
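For what gputools currently offers, the entry points are GPU versions of a few standard operations. A sketch (this assumes a CUDA-capable NVIDIA card and a working gputools install, so it won't run on the Intel HD 4000 graphics mentioned in the question):

```r
# Hedged sketch: gputools requires the CUDA toolkit and an NVIDIA GPU.
library(gputools)

A <- matrix(rnorm(1000 * 1000), 1000, 1000)
B <- matrix(rnorm(1000 * 1000), 1000, 1000)

C_gpu <- gpuMatMult(A, B)   # matrix multiply on the GPU
C_cpu <- A %*% B            # same computation on the CPU
all.equal(C_gpu, C_cpu)     # should agree up to floating-point tolerance
```

Anything beyond these wrapped primitives (a custom estimator, say) you would have to write yourself against the GPU.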

Ari B. Friedman