Apple i7 laptops, anyone know when? (pg. 10)
DJ RANN
Agreed (even though you were the first person in this thread to bring up Dell after the OP) ;)

and oh yes, my arse is way too tight for that because unlike you two, I've never had anything shoved up it :toothless :tongue3 :p
DJ RANN
Just heard from a friend that used to work at a major B2B chip distributor that Apple have been developing a new multicore chip that has Intel scrambling for ideas.

The chip is tailored to OS X, and the production models they currently have (not prototypes) are benchmarking at a minimum of four times the sustained speeds of the newest Intel prototypes. And they don't need fans. :eyes:

The only downsides are that they will be expensive, not available until 2011, and require a completely new version of OS X.

So here's the next bit of news:

AMD are working on a retail release of a 48-core(!) chip, which is going to make both Intel and Apple re-evaluate their proposed chip prices for the next year.

And allegedly the hexacore i7 980X Mac Pro (3.6GHz) is going to be released in June... and Apple are putting two of these in there, meaning a 12-core Mac Pro.

But that's if you can't wait for the Intel Light Peak CPUs :eyes:
Fledz
That's interesting news, especially since Apple moved to the Intel chips a while back.
I think it's good because competition is essentially great for the consumer.
Of course it's going to be expensive, it's Apple :p However, if they overprice it too much then they risk losing the people who aren't too fussed about what OS they use, so they will have to be careful with that. 2011 is also a long time away, and chances are both Intel and AMD will have new chips by then too. Intel already has prototypes for new ones, though I guess they always do.

To be honest, I think Intel will be the winners because I believe their direction is the best one. Instead of cramming in as many cores and as much processing power as possible, they are focusing on efficiency and hyperthreading. A 48-core chip will probably need like a 2000W PSU :nervous:
evo8
quote:
Originally posted by DJ RANN
Just heard from a friend that used to work at a major B2B chip distributor that Apple have been developing a new multicore chip that has Intel scrambling for ideas. [...]


:haha:
DJ RANN
quote:
Originally posted by Fledz
That's interesting news, especially since Apple moved to the Intel chips a while back. [...]


Yeah, I think the consumer is going to be the one that benefits most here. I feel that things (Moore's law) have somewhat stagnated over the last few years. I remember back in 2001 building a custom audio PC, and month on month there were large jumps in CPU speeds because of the war between AMD and Intel.

I think anyone is going to have a hard time beating Intel in the long game because of how well they are set up and their massive infrastructure. Unless someone pioneers some new technology that Intel just can't get its hands on, I reckon Intel should win out.

I do think it's good though that there's going to be some proper competition from both Apple and AMD, as AMD kind of dropped off the map a little and Apple's hookup with Intel made it a bit of a monopoly.

This Light Peak is interesting though; they're saying USB 3.0 could be bypassed because of it:

http://www.youtube.com/watch?v=khPx1dEIPnA
Fledz
Yeah, Light Peak could mean we get rid of half the cables. I would very much like that.
kitphillips
I think you're both talking hype.

RANN, Nvidia and Intel have both been working to bridge the gap between GPU and CPU type devices for a couple of years now. Building CPUs with the vast number of cores of a GPU but the power and instruction set of a CPU is nothing new really. GPGPU technology will take advantage of this sort of thing quite soon anyway; there are already plugins that do it. Basically what you're saying is that AMD is building a graphics card which thinks it's a CPU. Cool, but not groundbreaking IMO.

In terms of Apple's apparent new solution, well, it's an Apple rumour, so we'll take it with a grain of salt. I'd be curious to see it, but if it doesn't run Windows then it's not of much use to me tbh. Apple has blasted ahead recently due to two things: the iPod/iPhone, which has raised their image in the eyes of the mainstream user, and the fact that almost all Windows programs now work on Mac, and Windows can be booted on Mac hardware. If they move away from the Intel architecture, it'll make life harder for developers to develop for Windows and Mac at the same time, and also make it impossible for Windows to run on Mac, which will curtail their market share by making it harder for Windows users to transition smoothly to Mac.

As for Light Peak, well, USB 3 is more than fast enough for most needs, and will probably be the standard for the next 5 years at least. Light Peak's great but will have slow adoption due to a lack of backwards compatibility.
DJ RANN
quote:
Originally posted by kitphillips
I think you're both talking hype. [...]


Not so sure it is hype in the long run. While I'm certainly not going to run out and buy shares in Light Peak or A4 chips, the fibre-optic solution is going to be the future.

I work with some major audio/voice/comms installers that do all the major post-production, music and film studios here in Hollywood, and all of them are dropping copper cable in favor of fibre for their main infrastructure.

Many didn't even bother with HDMI (dumb handshake bull, distance limitation, new connector, etc.) and just bypassed it.

The same will happen with computers, as the bottleneck increasingly becomes the bus speeds and data transfer rates between components, not the components themselves.

GPU technology is not going to be the answer (apart from for gaming machines), as processors are being made that will be able to take advantage of 64-bit platforms.

As for USB 3, there's talk of incorporating the Light Peak connector into a USB connector, meaning that during the crossover period of these technologies, as long as the socket has one system or the other, you can use either.

USB 3 is certainly fast enough for most things, but with the jump in video formats (to 4K as the digital standard), not to mention the audio that goes along with it (7 channels), the Light Peak standard is the only real option that offers any longevity. People are going to get fed up (and some already are) with having to adopt new standards that only offer minimal increases (USB 1 to USB 2 to FW 400 to FW 800 to USB 3, etc.).
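
To put rough numbers on that (back-of-the-envelope only, assuming uncompressed 4K at 24fps; real pipelines compress, so treat these as ballpark figures): 4096 x 2160 is about 8.8 million pixels per frame. At 10-bit 4:2:2 (20 bits per pixel) that's roughly 177 Mbit per frame, or about 4.2 Gbit/s at 24fps; at 12-bit 4:4:4 (36 bits per pixel) it's roughly 319 Mbit per frame, or about 7.6 Gbit/s. USB 3 signals at 5 Gbit/s but delivers more like 3-4 Gbit/s of real throughput after protocol overhead, so the first case already saturates it and the second blows past it, while Light Peak's quoted 10 Gbit/s per channel handles both with headroom. Seven or eight channels of 24-bit/96kHz audio adds only around 16-18 Mbit/s on top, which is nothing by comparison.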

But having said that, I agree USB 3 will take hold for the time being, as there are already retail USB 3 products and Light Peak is still a prototype.
kitphillips
quote:
Originally posted by DJ RANN

GPU technology is not going to be the answer (apart from for gaming machines), as processors are being made that will be able to take advantage of 64-bit platforms.


Hmm? Not sure what you're saying here. Looks to me like GPU and CPU technologies are converging, so we're going to wind up with two chips, but they'll probably be able to do the same things.

Agreed on the rest though. For the short term it is hype, but in the long term (say, 10 years) I'm sure fibre will be the standard for pretty much everything.
echosystm
quote:
Originally posted by DJ RANN
GPU technology is not going to be the answer (apart from for gaming machines), as processors are being made that will be able to take advantage of 64-bit platforms.


Explain this bit about "64-bit platforms". I don't understand what you are trying to say.

Regardless, GPU and CPU functions will be increasingly converged. The reason is that typical GPUs are far better than CPUs at floating-point arithmetic, so there are significant performance gains from offloading that processing to a GPU. For example, I'd say an average quad-core CPU probably pulls around 60 gigaflops (60 billion floating-point operations per second), whereas a high-end video card probably does around 300. This is obviously huge to us musicians, since all audio stuff is floating point these days. Having the CPU and GPU integrated also obviously alleviates pressure on "bus speeds", since there isn't really a "bus" in the traditional sense. I'm not saying this is going to be the long-term solution, but it is the immediate solution, and not just for gaming.
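
To make the offloading idea concrete, here is a minimal sketch in CUDA of pushing a simple per-sample gain (one floating-point multiply per sample) onto the GPU. The buffer size, gain value and launch configuration are made up for illustration; a real audio plugin would also have to deal with the host-to-GPU copies on either side of the kernel, which is exactly where the "bus" pressure comes back in.

code:
// Illustrative sketch only: apply a gain to an audio buffer on the GPU with CUDA.
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

__global__ void applyGain(float* samples, int n, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        samples[i] *= gain;  // one floating-point multiply per sample, spread across many threads
}

int main()
{
    const int n = 1 << 20;                 // ~1M samples of dummy mono audio (~22 s at 48 kHz)
    std::vector<float> host(n, 0.5f);

    float* dev = nullptr;
    cudaMalloc(&dev, n * sizeof(float));
    cudaMemcpy(dev, host.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    const int block = 256;
    const int grid = (n + block - 1) / block;
    applyGain<<<grid, block>>>(dev, n, 0.8f);   // the GPU does the arithmetic
    cudaDeviceSynchronize();

    cudaMemcpy(host.data(), dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    cudaFree(dev);

    printf("first sample after gain: %f\n", host[0]);  // 0.5 * 0.8 = 0.4
    return 0;
}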

quote:
Originally posted by DJ RANN
Just heard from a friend that used to work at a major B2B chip distributor that Apple have been developing a new multicore chip that has Intel scrambling for ideas.


Apple don't have the resources to go up against AMD, let alone Intel. I call BS.

DJ RANN
quote:
Originally posted by echosystm
Explain this bit about "64-bit platforms". I don't understand what you are trying to say.

Regardless, GPU and CPU functions will be increasingly converged. The reason is that typical GPUs are far better than CPUs at floating-point arithmetic, so there are significant performance gains from offloading that processing to a GPU. For example, I'd say an average quad-core CPU probably pulls around 60 gigaflops (60 billion floating-point operations per second), whereas a high-end video card probably does around 300. This is obviously huge to us musicians, since all audio stuff is floating point these days. Having the CPU and GPU integrated also obviously alleviates pressure on "bus speeds", since there isn't really a "bus" in the traditional sense.


If you call dropping discrete GPUs converging, then I agree. Look at the way MacBooks and less expensive laptops have now evolved. They use shared graphics (as opposed to dedicated GPUs), and don't be fooled into thinking they're integrating a GPU - they're just offloading it onto the CPU. It's a cop-out.

This is where gamers don't seem to get the idea about audio. Video hardware is built with drivers that allow proper utilization of the dedicated hardware, but audio is reliant on the CPU and the system as a whole (obviously apart from dedicated DSP) and is therefore far more prone to overall system bottlenecking.

Until they start treating audio the same way, the video GPU way of doing things is sadly not relevant to us. The only advantage right now is that some processor power is freed up by having a separate GPU to handle video.

Not really a huge advantage, unless you can tell me something I'm missing?
echosystm
quote:
Originally posted by DJ RANN
Until they start treating audio the same way, the video GPU way of doing things is sadly not relevant to us.


The whole point of "converging" the technologies is to allow GPUs to take on general FLOPS and not be limited to video-specific operations. This is the way things ARE going. Isn't that what we are arguing? I'm so confused.