
Parallella - $99 16-Core Raspberry-Pi-On-Steroids Kickstarter (85% funded, 30hr left)


Thraktor

Member
Kickstarter page here.

As I didn't find a thread on this, I thought perhaps some of the coders on here might be interested in getting on board before the Kickstarter finishes.

Parallella

Probably the best way to describe Parallella is as some sort of crazy mutant steroid-infused version of the Raspberry Pi. It's not related to the Raspberry Pi, but it's similar in that it's a credit-card-sized computer with an ARM CPU; it's also got quite a bit that the Raspberry Pi doesn't (and a higher price point to go with it). For $99, you get a board with a Xilinx Zynq-7010, 1GB of RAM, a 16-core Epiphany processor, and the usual SD card, Ethernet, HDMI and USB ports. The Zynq is a dual-core ARM Cortex-A9 running at 800MHz with one notable addition: an FPGA (with 28,000 logic cells) on the same chip. While that won't mean much to non-techies, FPGAs (field-programmable gate arrays) are normally very expensive devices, so this is about as cheap as you're going to get if you want to get your hands on some FPGA hardware (although making use of the FPGA is not for the uninitiated).

Epiphany Processor

The whole point of the project, though, is the Epiphany processor. Epiphany is a many-core architecture of simple RISC CPU cores connected by a high-bandwidth mesh interconnect. It scales from 16 cores (which you get on the $99 board) to 64 cores (an engineering sample of which you can get on a $750 board), and all the way up to 4096 cores in the future. The big advantage of this kind of architecture over something like, say, a GPU is that you're working with proper C-programmable CPU cores, so you get real task-level parallelism (i.e. each core can run a completely different task on different data, whereas on a GPU you're largely limited to performing the same operation across a set of data). The design's also very energy-efficient, with the board drawing 5 watts and even the 64-core chip drawing only 2 watts (while competing with server CPUs in certain benchmarks). There are a few disadvantages, particularly as far as memory is concerned: the cores have no cache and only 32KB of local data memory each on current chips, so it's best suited to applications that either have low memory requirements or where memory can be managed explicitly. Nonetheless, it's a really interesting design for those of us who want to mess around with a properly "many-core" architecture.
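To make the task-parallelism point a bit more concrete, here's a tiny plain-C sketch using POSIX threads. Nothing here is Epiphany-specific, and the two tasks and their data are just made up for illustration; the idea is simply that each worker runs a completely different function on its own data, which is the kind of thing a GPU's lockstep model doesn't really give you.

Code:
/* Task-parallelism sketch in plain C with pthreads (illustrative only;
 * the Epiphany toolchain is different, but the idea - different code
 * per core, each on its own data - is the same). */
#include <pthread.h>
#include <stdio.h>

/* Task 1: sum an array of integers. */
static void *sum_task(void *arg)
{
    int *data = arg;
    long sum = 0;
    for (int i = 0; i < 8; i++)
        sum += data[i];
    printf("sum = %ld\n", sum);
    return NULL;
}

/* Task 2: count characters in a string - completely unrelated work. */
static void *count_task(void *arg)
{
    const char *s = arg;
    int n = 0;
    while (s[n])
        n++;
    printf("length = %d\n", n);
    return NULL;
}

int main(void)
{
    int numbers[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    char text[] = "parallella";
    pthread_t t1, t2;

    /* Two "cores" running two different tasks on two different data sets. */
    pthread_create(&t1, NULL, sum_task, numbers);
    pthread_create(&t2, NULL, count_task, text);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    return 0;
}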

What Can You Use It For?

Well, you could just ignore the Epiphany chip and use it as a cheap PC (it runs Ubuntu), but that'd be a bit of a waste. There are quite a few plans for different applications for it, like bitcoin mining (which it should certainly do well at in terms of bitcoins per watt), running a Minecraft server, Folding@home, media transcoding, etc. There are already some demonstrations of it running computer vision code, and it could potentially do some very interesting things with a Kinect attached. With the FPGA involved, there are some interesting applications as a software-defined cognitive radio (i.e. performing wireless communications with the ability to change frequency on the fly to maximise bandwidth and avoid interference), although that's not necessarily something an end-user would make use of. Personally, I just want one to mess around with and see if I can't come up with some interesting ideas while learning a bit about parallel processing (my first project will be an inevitably shoddy path-tracer), and maybe go crazy at some point and try to reconfigure the FPGA. Plus, by backing it I'm making it a lot more likely that at some point they'll be able to offer 64+ core boards at a reasonable price point, at which point things get really interesting.
 

Thraktor

Member
No-one's interested? For what it's worth, the project's now funded to $713k (95% of the $750k goal) with 24 hours left, so it does look like it'll get there, albeit in last-minute fashion.

If anyone who's interested sees this, here's a link to the architecture reference manual for the Epiphany processor, which might make for enlightening reading. There's also a roadmap of their future plans for the architecture:

[Image: Epiphany architecture roadmap]


To illustrate how feasible their plans for a thousand-core chip by 2014 are, consider that their current 64 core chips at 28nm have a die size of just 11.5mm² and draw 2 watts, compared to a high-end GPU like the GTX680, which occupies a 294mm² die and draws 195 watts. The architecture is inherently scalable, so there's a lot of room for increases in the number of cores even without considering die-shrinks.
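As a rough back-of-the-envelope check (my own arithmetic, using the figures above): 11.5mm² / 64 cores is about 0.18mm² per core, so even 1,024 cores would only need on the order of 1,024 × 0.18 ≈ 184mm² at 28nm, which is still smaller than the GTX680's die, and that's before any process shrink.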
 
This stuff sounds cool, but as I'm relatively inept when it comes to computers I'm afraid I don't see how I could take advantage of this.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad

So this is something I should buy if I want to write games for it?

Is it a CPU + CPU bundle? I'd love to get my hands on a piece of the pie.

GTX680 is known for low-power consumption.
 

Thraktor

Member
This stuff sounds cool, but as I'm relatively inept when it comes to computers I'm afraid I don't see how I could take advantage of this.

Yeah, as it seems very likely to get funded at this point, I'd advise waiting it out. If some interesting software appears for it after coders start to get their hands on it, then you can pick one up at that stage without having to worry about buying something you're never going to use. At this point I'd only really recommend it to people who want to write software for it, rather than people who just want to run other people's software on it.

So this is something I should buy if I want to write games for it?

Is it a CPU + CPU bundle? I'd love to get my hands on a piece of the pie.

GTX680 is known for low-power consumption.

I don't think it's really suited for writing games on, as there's no GPU to speak of (you could fashion some kind of GPU functionality out of the FPGA and the Epiphany chip, but that would be far from beginner-friendly). It is a full system on a board, though, as it's got an ARM CPU, a gig of RAM, USB, HDMI and SD ports and so forth, with the Epiphany chip acting as a co-processor for the ARM CPU.
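For the curious, here's roughly what the host side of that co-processor arrangement looks like with the eSDK's e-hal library: the ARM loads a program onto an Epiphany core and then reads results back out of its local memory. I'm going from memory of the SDK docs here, so treat the exact function names and signatures as approximate, and the binary name "e_task.srec" and the 0x2000 result address are just made-up placeholders.

Code:
/* Host-side sketch using the Epiphany SDK's e-hal interface (function names
 * recalled from the eSDK documentation - verify against e-hal.h before use).
 * The device-side binary "e_task.srec" and the result address are hypothetical. */
#include <stdio.h>
#include <e-hal.h>

int main(void)
{
    e_platform_t platform;
    e_epiphany_t dev;
    int result = 0;

    e_init(NULL);                    /* initialise the e-hal library            */
    e_reset_system();                /* put the Epiphany chip in a known state  */
    e_get_platform_info(&platform);

    /* Open a workgroup covering the whole chip (16 cores on the $99 board). */
    e_open(&dev, 0, 0, platform.rows, platform.cols);

    /* Load the device-side program onto core (0,0) and start it. */
    e_load("e_task.srec", &dev, 0, 0, E_TRUE);

    /* Read back a result the core is expected to leave at local address 0x2000. */
    e_read(&dev, 0, 0, 0x2000, &result, sizeof(result));
    printf("result from core (0,0): %d\n", result);

    e_close(&dev);
    e_finalize();
    return 0;
}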

As far as comparisons go, the GTX680 claims 15.85 Gflops/Watt, whereas the 64-core Epiphany claims to hit 50 Gflops/Watt, so it's certainly a very energy-efficient design, although of course theoretical Gflops only tell part of the story, and the two architectures are suited to different tasks. The 16-core version you get on the $99 board isn't quite as efficient, as it's manufactured on a 65nm process, but the power draw is still a fairly negligible 5W for the whole board.
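(Those figures roughly check out if you take the usual theoretical single-precision peaks: the GTX680's ~3.09 Tflops over its 195W TDP works out to ~15.8 Gflops/Watt, and the 64-core Epiphany's ~100 Gflops (64 cores × 800MHz × 2 flops/cycle) over ~2 watts works out to ~50 Gflops/Watt.)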
 

usea

Member
As interested as I am in parallel programming, I don't know if I can justify dropping $100 for something I might get a first run version of at some point down the road. When it launches I'll wait for some reviews and then buy one.
 
I've also been keeping tabs on this these past few weeks. It's pretty impressive how close they've gotten to the minimum funding goal despite only getting their pitch up to a "proper campaign" level of detail and polish a couple of days ago.

They're pushing so hard for people to hack around on the thing, open-sourcing nearly everything, and generally just seem a bit loopy in a good way; it really feels like one of those Kickstarter projects that stands to spark off some crazy things down the line in its wake.
 

Thraktor

Member
It's now fully funded (actually at $834k with 5 hours to go), so anyone who's backed can expect their board in May (or earlier for some of the more expensive pledges).

I've also been keeping tabs on this these past few weeks. It's pretty impressive how close they've gotten to the minimum funding goal despite only getting their pitch up to a "proper campaign" level of detail and polish a couple of days ago.

They're pushing so hard for people to hack around on the thing, open-sourcing nearly everything, and generally just seem a bit loopy in a good way; it really feels like one of those Kickstarter projects that stands to spark off some crazy things down the line in its wake.

Yeah, that's what I'm hoping for, and I have the feeling that some applications that I never even would have considered will pop up. And if not, at least I've got a nice little device to learn a bit of parallel programming on.
 

SMT

this show is not Breaking Bad why is it not Breaking Bad? it should be Breaking Bad dammit Breaking Bad
Thank you Thraktor, now I know all I need to know thanks to your detailed answers.
I'd back you.
 

Thraktor

Member
interested in what this sort of architecture could mean for cgi

if you could get these cores rendering individual frames concurrently, o gawd muh balls.

In its current form, the Epiphany chip will be rather poor at rendering, due to the limited on-chip memory. That said, my plan for the board is to put together a path-tracer, although it'll be a very simplistic one at best, and more a proof-of-concept than anything else.

In the long term, improvements in the amount of on-chip memory (particularly from 3D stacking) might make it a decent architecture for ray- and path-tracing, but my guess is it'll continue to be tricky from a programmer's perspective due to the lack of any kind of cache.
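To illustrate what "managing memory explicitly" ends up meaning in practice when you've got no cache and only 32KB per core, here's a generic tiling sketch in plain C. It isn't Epiphany-specific code: memcpy stands in for whatever DMA or copy mechanism the real hardware provides, and the buffer sizes and names are made up for the example.

Code:
/* Generic tiling sketch: process a large buffer in external memory in
 * chunks small enough to fit in a tiny core-local buffer. memcpy stands
 * in for a real DMA engine; sizes and names are illustrative only. */
#include <stdio.h>
#include <string.h>

#define LOCAL_BYTES (16 * 1024)                  /* leave room for code/stack   */
#define TILE_ELEMS  (LOCAL_BYTES / sizeof(float))

static float local_tile[TILE_ELEMS];             /* stand-in for core-local RAM */

/* Process one tile in local memory (here: a trivial brightness scale). */
static void process_tile(float *tile, size_t n)
{
    for (size_t i = 0; i < n; i++)
        tile[i] *= 0.5f;
}

/* Stream a large external buffer through the small local buffer. */
static void process_buffer(float *ext, size_t total)
{
    for (size_t off = 0; off < total; off += TILE_ELEMS) {
        size_t n = (total - off < TILE_ELEMS) ? (total - off) : TILE_ELEMS;
        memcpy(local_tile, ext + off, n * sizeof(float));   /* "DMA in"  */
        process_tile(local_tile, n);
        memcpy(ext + off, local_tile, n * sizeof(float));   /* "DMA out" */
    }
}

int main(void)
{
    static float framebuffer[64 * 1024];         /* pretend external memory */
    for (size_t i = 0; i < 64 * 1024; i++)
        framebuffer[i] = 1.0f;
    process_buffer(framebuffer, 64 * 1024);
    printf("first pixel after processing: %f\n", framebuffer[0]);
    return 0;
}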
 