Once again - too expensive. Even more "too expensive" than the previous one. Not that there can be a direct comparison between market niches, but for that amount of money you can get an x86 itx board with a quad-core i7, 16 gigs of ram and a decent gpu.
Other vendors sell dev/eval boards below their actual value in order to hook OEMs into using the platform in their products, and then there is nvidia, too big (or too small) to pass up the opportunity to rake in profits on something as limited in volume as dev boards.
Way to go nvidia - I mean, winning designs for a chip that costs like $15 to make and sells at $400 to OEMs... That would actually explain why nvidia socs are nowhere to be found in consumer products.
Will that ITX i7 system use 15 watts?
You appear to have missed the "Not that there can be a direct comparison between market niches" part.
BTW, this product is marketed for automotive - not mobile phones or tablets, where battery life is critical. Vehicles have kilowatts of power at their disposal and would have no problem running a more power-hungry platform.
So while an i7 itx platform would probably not use 15 watts, it will offer much better performance and value for the price and power consumption. And being more power-hungry would not be detrimental if you put it in a car. So no, fitting in a 15-watt power budget is not a big whoop, and it certainly doesn't justify the ridiculous price.
So why make your comparison, if you admit it's not possible?
By all accounts there's a market for this board; it is being purchased and it is being used. That suggests the pricing is correct. If it were cheaper it would likely not leave enough margin for future R&D to continue moving the state of the art onwards.
Where you're wrong is that there is no direct comparison. There is, in the form of Atom boards, but they're a bit underpowered. But then again, there is literally a competitor in the form of Intel's Gordon Ridge, which comes nearly ready-made in an automotive format, with ready-to-install kits.
And even there, an i7, or really any architecture from Haswell onwards, easily scales from 217W down to 1.3W, even on the same chip.
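You can even see that scaling for yourself; here's a minimal sketch using Linux's intel-rapl powercap interface (the sysfs path varies by machine and writing the limit needs root, so treat the path and values as assumptions to check):

```python
# A minimal sketch, assuming a Haswell-or-newer Intel CPU on Linux with
# the intel-rapl powercap driver loaded. The sysfs path below is the
# common layout but varies per machine; writing the limit requires root.
RAPL = "/sys/class/powercap/intel-rapl:0"

def read(path):
    with open(path) as f:
        return f.read().strip()

print(read(f"{RAPL}/name"))  # usually "package-0"
limit_uw = int(read(f"{RAPL}/constraint_0_power_limit_uw"))
print(f"current long-term package limit: {limit_uw / 1e6:.1f} W")

# Cap the package to a 15 W budget (15,000,000 microwatts):
with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
    f.write("15000000")
```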
As long as you can create robust silicon with an abundance of GPIO/CAN and decent software, anything is fair game.
Also, the main limitation in automotive isn't power (which is important in standby, by the way - no one likes to discover that their car battery has gone flat after a week of not driving the vehicle), but cooling. Leaving a car in 50°C/120°F for a day shouldn't damage the thing.
Power isn't a problem in automotive; what is a problem is heat. In automotive you have to validate a design for every conceivable environment. This includes days of -40°F and 120°F temperatures, with a hot engine running and limited airflow inside the dash. You think a desktop i7 is going to run in a cramped dash with no airflow? Is it going to last for the life of the vehicle? The nVidia solution is much better because it keeps the TDP low and performance relatively high, has soldered components, and may even require no active cooling in production versions. That removes or mitigates dozens of failure modes.
There is always this: http://www.anandtech.com/show/11052/asrock-shows-d... -- ITX w/MXM -- so you can put a 35W i7 in there with a 1080 MXM...
Sorry, MicroSTX, so even smaller...
This isn't for your everyday computing. It's not meant for tablets, desktops, or anything of that sort. This is specifically developed for 'edge cases' in AI where it becomes desirable to shift processing on-location rather than constantly sending data back and forth to a data center to make sense of anything. Your example of an x86 board with a quad-core i7, 16 GB of RAM, and a 'decent' GPU would take up, at a minimum, ~6.64 times more space (assuming mini-ITX), consume untold times more energy, and certainly isn't going to fit in anything like a traffic cam, which this is designed for. I'm gonna have to assume you're trolling, because there's no way anybody would write such an asinine comment without doing it on purpose.
I am going to assume you are an nvidia fanboy who has never seen an itx board in real life, because if you had, you'd realize it is not that much bigger than this dev board. And judging by laptop mobos, an i7 can fit into much smaller spaces and much more modest power budgets. Amazing, I know.
Untold times more energy? Maybe numbers beyond 4 boggle your mind too? There is no magic to this; it is not intrinsically superior - the computing performance it delivers is in line with the power it consumes. A vega-based gpu has 25 tflops of fp16 throughput at below 250 watts, meaning less than 10 watts per tflop, whereas this thing gets its tflop at 15 watts, while costing as much as a solution that is 25 times more powerful. You mean to tell me that if amd were to chop off 1/25th of a vega to get a solution that does 1 tflop in a 10-watt power budget, that would be such an achievement that it would justify asking the same price for this 1/25th as for the 25-times-more-powerful solution?
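Spelled out, with the caveat that the vega numbers are pre-release estimates and the shared $600 price point is an assumption for illustration:

```python
# The arithmetic above, spelled out. The vega figures were pre-release
# estimates and the $600 common price point is an assumption, so treat
# this as the shape of the argument, not as data.
vega_tflops, vega_watts = 25.0, 250.0   # claimed fp16 rate and board power
tx2_tflops,  tx2_watts  = 1.0,  15.0    # claimed fp16 rate and module power

print(vega_watts / vega_tflops)  # 10.0 W per fp16 tflop
print(tx2_watts / tx2_tflops)    # 15.0 W per fp16 tflop

price = 600.0                    # assumed price for both
print(price / vega_tflops)       # $24 per tflop
print(price / tx2_tflops)        # $600 per tflop
```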
It is not a miracle product. Its power requirements are well in line with the performance it offers. It consumes little power because it doesn't have too many transistors, and it doesn't offer that much performance. Its power requirements do not justify its ridiculous cost. It should cost like 50 bucks for the module, 75 for the dev board.
I don't understand; your common sense seems to have gone out the window. AMD Vega performance? It's not been released; we don't know what it is. Chop off 1/25 of a Vega GPU? What about the chipset, the CPU?
Elsewhere you talk about an i7 and cheap dGPUs. Are you seriously suggesting that you can get more perf/W in a 15W TDP package using x86 and a non-latest-gen GPU uarch?
Ladies, please! You're both wrong. ;) (I have absolutely no idea what you're talking about.)
It's not a miracle product, but these applications specifically require a higher ratio of GPU to CPU power, making this product superior in those places. There's a reason they've released this, and there's a reason that intel boards in this area always have dgpus. It's absolutely essential, and if AMD has really caught up to Intel, the Intel piece will be entirely overpriced.
The way ARM has improved, Intel's presence in any embedded mobile market is probably temporary.
Yes I have seen an ITX board, which is why I actually did the maths and gave you a number. A mini-ITX motherboard is 6.64 times the size of the TX2. And no, I'm not an NVIDIA fanboy, considering the last product I owned from them was the GeForce 6200 over a decade ago.
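For anyone who wants to check it, the 6.64 is just the ratio of the board areas, assuming the comparison is to the TX2 module rather than the (itself roughly mini-ITX-sized) dev board:

```python
# Where the 6.64x comes from, assuming the comparison is against the
# 50 x 87 mm TX2 module rather than the mini-ITX-sized dev board.
mini_itx = 170 * 170      # standard mini-ITX: 170 mm x 170 mm = 28,900 mm^2
tx2_module = 50 * 87      # TX2 module: 50 mm x 87 mm = 4,350 mm^2
print(round(mini_itx / tx2_module, 2))   # 6.64
```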
Your comparisons are becoming more and more ridiculous. You can't just 'chop off' 1/25th of an UNRELEASED chip and expect to achieve the same efficiency, never mind the fact that you've just conveniently ignored the complete lack of a CPU in your 10W example. Speaking of lacking, where do all the other components fit into your imaginary board? 802.11ac WiFi? Bluetooth? 8GB of LPDDR4 RAM? 32GB of storage? I assume in your wonder board these don't cost any money and don't consume any power.
Intel GPUs, especially in atom-based SOCs, are rubbish: they have always taken more die area per unit of performance, have very low utilisation efficiency and poor power efficiency, have non-existent AI capability (no efficient FP16 or INT8 support), and are very imbalanced designs.
Pretty sure there's nothing else on the market that can do what this thing can do. I can say that with confidence because if there were, it would be used in self-driving car platforms. "Not that there can be a direct comparison between market niches, but for that amount of money you can get an x86 itx board with a quad-core i7, 16 gigs of ram and a decent gpu." For that amount of money one can also get 100 custom-made t-shirts. What's your point?
Costs $15 to make? Are you talking about manufacturing costs? Firstly, maybe the SoC alone costs $15 to make(?), but I'm guessing the whole board is a lot more than that. But so what? It's part of a multi-billion dollar R&D effort spanning software and hardware.
$15 for the chip, $30 for the module, $60 for the dev board. Those would be realistic production costs - I mean how much it costs them to make it. What justifies asking 10 times more than it's worth?
"Multi-billion" R&D costs are covered by selling a lot of them chips at reasonable margins, not by selling a few but ridiculously overpriced. And if it really did cost "multi-billion" to R&D such products, they nvidia would be bankrupting itself as it hasn't got anywhere near that amount of revenues from automotive since they got into, and it would eat most of their revenues from other market niches. Which would conflict with their financial results, which indicate that the actual R&D cost for those products in far from "multi-billion" and more like "several-million". And that's the beauty of IP - once you do the design, you can churn millions upon millions of products based on it, which reduces the R&D cost per product to almost negligible. Especially when the IP is not something brand new from scratch but barely incremental update of a previous design.
Quite frankly, while it is understandable why I would criticize the pricing - not only from my own perspective, but also because they wouldn't be able to win many designs at that cost - it is quite curious why individuals like you rush so desperately to defend it. Why are people like you acting as if I have said something bad about your mommas?
Thank you ddriver for actually making sense. This new Jetson board is just way too overpriced to even be properly justified for its "niche market."
I'm at a loss as to why people are crying over the price. Do you want one, but can't afford it? Or do you fear for the budgets of the University of Oxford and Tesla?
Or, you know, they'd like to see many companies succeed so technologies are more competitive. If no one cared then no one would be reading a pointless article on a pointless site.
This is not a consumer product. It is not the kind of thing you "want one" of. I could put this to good use and buy tens of thousands, had it been worth the money. At this price it is not.
Once again, what's more puzzling here is why people like you defend the ridiculous pricing with such dedication. And sure, some big OEMs with massively overpriced products could afford it. A billionaire could afford to blow his nose on 100 dollar bills, but do they do that?
""Multi-billion" R&D costs are covered by selling a lot of them chips at reasonable margins, not by selling a few but ridiculously overpriced. And if it really did cost "multi-billion" to R&D such products, they nvidia would be bankrupting itself as it hasn't got anywhere near that amount of revenues from automotive since they got into, and it would eat most of their revenues from other market niches."
The multi-billion dollars spans all their products. Though there are some additional hardware and firmware costs associated specifically with Tegra and specifically with Jetson, and there is similarly software development that covers Jetson but not, for instance, gaming.
"And that's the beauty of IP - once you do the design, you can churn millions upon millions of products based on it, which reduces the R&D cost per product to almost negligible."
It's not even close to almost negligible! You can't just conveniently divide their costs the way you see fit for your rant of the day.
"15$ for the chip, 30$ for the module, 60$ for the dev board. Those would be realistic production costs, I mean how much it costs them to make it. What justifies asking 10 times more than its worth?"
The price of the board isn't based on cost. The cost limits how low a price they can charge, but it doesn't set the price. The price is based on value. NVIDIA is able to charge the money it charges because of the value its product provides. That's how money is made: by providing a good or service with a value that is greater than the cost to provide it. The fact that you think the cost to produce the board (whether you figure in the R&D and SG&A costs or not) is what the board is "worth" shows that you don't understand the very basics of business or economics.
"Quite frankly, while it is understandable why I would criticize the pricing, not only from my perspective, but also from the perspective that they wouldn't be able to win many designs at that cost, it is quite curious why individuals like you rush so desperately to defend it? Why are people like you acting as if I have said something bad about your mommas?"
This is a public message board, and when we see you say something stupid we point it out. And the things you have said are quite stupid in a basic economic sense. Don't get mad about it; educate yourself. As for getting more design wins, it's interesting (i.e. laughable) that you think you can judge the supply/demand curve of the developer board market better than the company itself can. But from the things you've said, you most likely don't have any idea of what a supply/demand curve is or what it means. You're just basing your judgments on thin air. If you think a product is worth its cost to produce, then a supply/demand curve is useless to you. Of course, if you were making the board, I hope eating would be useless to you as well - but maybe you'd add eating costs into the worth of the product. If you wanted a shiny new Ferrari, I guess you'd have to add that into the developer board's worth, too.
You are wrong; they're not saying that they should sell at cost, but that with R&D included their cost is maybe $120 all-in per dev kit. When they ask $600 for it, that's grossly overpriced, and their marginal cost will not be equal to their marginal revenue.
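Putting rough numbers on that claim (taking the $120 all-in estimate at face value against the announced list price):

```python
# Rough margin math: the $120 all-in cost is the commenter's unverified
# estimate; the $599 is the dev kit's announced price.
cost, price = 120.0, 599.0
print(f"gross margin: {(price - cost) / price:.0%}")  # ~80%
print(f"markup: {price / cost:.1f}x")                 # ~5.0x
```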
I still don't get why anyone cares. Ferraris are expensive. Rolexes are expensive. Boeings are expensive. It happens.
Wrong. That's branding, and it's specifically designed to target different users/audiences. I know nothing about you, but assuming you are a mortal as I am, these things weren't designed to target you or me. The massive dev board of the TX1/2 is completely useless: if you have to install any peripheral via usb, like a camera, you end up with just one spare micro-usb port, which means you need to buy an extra usb hub just to connect a mouse and keyboard as a minimum. You then have to invest another 300-500 € in a dedicated carrier board if you want to take full advantage of the jetson TX1's size. If you don't believe me, go and check the prices of carrier boards sold by auvidea or connecttech. On top of this, support is virtually nonexistent and the community is virtually no one. You are left hanging with all sorts of questions for days/months; the so-called advantages of, for example, connecting several cameras to the Jetson are virtually a wet dream: there are no drivers available, and if you want the only ones in existence you've got to pay another €2500. If you compare this thing with the widely used Raspberry pi in terms of community/support and even hardware usability, I can tell this thing will never take off.
Dude, what are you even talking about? What is 2.5k if you are designing a car and need a handful of these dev boards? These prices are perfectly fine for the niche this board is aiming at. It isn't aiming at home users playing with raspberry pi boards...
No, he specifically said $15 to produce. And then he said it was worth what it costs, so obviously he wants it to be sold at cost.
Oh, so you are a stupidity pointer-outer, are you? That's quite amusing, considering your post is riddled with stupidity and general lack of understanding of the subject or even common sense. Unlike you I have many years of experience in that field, I know how much stuff costs, and how much stuff is worth.
nvidia has been pretending to be intel lately; they act all high and mighty and expect ridiculous margins. Which is why their mobile platforms are nowhere to be seen. They are aiming to milk the one particular cow they know is easiest to milk, but they are not in a position to do so. This product is for the time being unique, but that won't last long. We are a few months away from zen apus, which would make this product irrelevant, at least at that price. With the added benefit that you don't get locked into cuda, but get to use modern OpenCL, which runs on a variety of other platforms, and can even be compiled to verilog and put directly into fpgas or even silicon.
Sure, they are making some money on automotive, but I highly doubt the actual reason for this is that their products are worth the money; corporate interests and politics are at play there. Not logic, not reason, not the consumer's best interest.
"That's quite amusing, considering your post is riddled with stupidity and general lack of understanding of the subject or even common sense. "
OK, show me what I said that shows a lack of understanding or common sense.
"Unlike you I have many years of experience in that field, I know how much stuff costs, and how much stuff is worth."
Saying it's so doesn't make it true.
"We are few months away zen apus which would make this product irrelevant, at least at that price. With the added benefit you don't get locked into cuda, but get to use modern OpenCL which runs on a variety of other platforms, and can even be compiled to verilog and put directly into fpgas or even silicon."
Yeah sure. Let's wait a couple years and see how that works out. Using OpenCL might be great if there were both strong hardware and strong libraries for it. Even if Zen APUs are competitive, AMD is going to leave it up to everyone else to create the development tools. It's something that takes years to do.
"Sure, they are making some money on automotive, but I highly doubt the actual reason for this is their products are worth the money, corporate interests and politics are at play there. Not logic, not reason, not the consumer's best interest."
Yup, everyone is stupid except you. NVIDIA, their customers... You just aren't able to say why. Must be some big conspiracy. If only the world wasn't corrupt you'd be a multi-trillionaire by now.
Yeah, today goldman sachs tells people to buy nvidia; to anyone who is not evil or stupid, that means "do not buy nvidia".
Granted, nvidia has the tools and the libraries, but that's just bait to lock in the lazies. Not everyone is lazy and talentless; not everyone needs to be held by the hand like a little baby.
I already have enough money to not really even care or think about money. That doesn't mean I oughta be wasting it on overpriced, poor-value stuff that is not worth it. Everyone is stupid, that's true. I am stupid too - just less stupid than most. I am smart enough to know what I am stupid about. Unlike you ;)
Dude. Take your meds.
Let me know when AMD has 8yrs of R&D and a few billion stuck in OpenCL development. They can't even properly launch a cpu (see reviews: games don't work right, SMT screwed, boards not even ready, etc.) or a gpu (see last gen). If AMD doesn't actually start MAKING money at some point, they're screwed. They have lost $8B+ in the last 15-20yrs. That's not good, right? They've laid off 30% of their engineers in the last ~5yrs. They've been restructuring for 5yrs. The "tools and the libraries" are what you pay the extra dough for. Cuda works because they stuck a decade of development and cash into it. It's taught in 450 universities across a few dozen countries.
The point of the tools etc. is that smaller guys can get in. The point of using something like unreal engine is that a smaller guy can make a credible game. You don't seem to get the point of all this stuff. Not everybody has the time or money to develop an end-to-end solution (even larger companies buy qualcomm etc. to get the modem and all-in-one for mobile), so part of the value of a device like this (or drive px etc.) is all that you get on top of the device.
10yrs ago I would not have thought about game dev. It would have taken 10yrs to make a crappy game. Today, on multiple engines (take your pick), I can make something worth paying for in a few years alone if desired. If you think that guy doing this is lazy or talentless, you're dumber than you think ;) Sorry, you're stupid. I'm ignorant about some stuff (cars, couldn't care less about learning them), but because I choose to be. But I'm not stupid.
Comical that you mention the stock; I'm guessing it will be $125-150 in the next year (under $100 today, $20 off in the last month). Autos will at some point make them money on socs (and I think they'll re-enter mobile at 10nm or below, as modems can be added without watt costs etc.), and AI/big data will get them the rest of the way. Record revenue, margins and income will keep happening. The next rev of cards will probably be able to be priced another $50 higher across the board, because Vega won't likely be able to do much against Nvidia's new lineup of revved-up boards with faster mem (GDDR5x on almost everything shortly, and faster clocks across the lineup on top), or if that isn't enough we'll probably see 12nm Voltas for xmas if needed, or at least Q1. Worst case, NV just lowers prices until they put out the winner again, just like Intel would do. Unlike AMD, both of their enemies can fight a price war for a year and win it next year.
AMD will do better vs Intel (should get some expensive server chip sales) than vs nvidia. Intel has been paying so much attention to racing down to ARM that they forgot about AMD even being alive. Nvidia hasn't done that and likely has multiple ways to play this out without a change in market share or much margin loss. Unlike Intel, they never sat on their laurels. They've forged ahead and even taken the smarter/cheaper (GDDR5x) and much easier to produce route. HBM2, like HBM, will be a problem for AMD going alone. If NV was in it maybe it wouldn't be expensive (NV could pull what AMD is pulling), but alone they'll be killing profits just like last time, and they're already late again just like last time, giving NV more room to make adjustments.
It's comical that AMD's slides compare HBM2 to GDDR5. That isn't what the competition will be using. They're going to be top-to-bottom GDDR5x shortly, except for the bottom card. NV has the next 3 months to sell 1080ti and capitalize on top pricing, then be able to cut if needed and not lose much, having already milked the cow. Unfortunately for AMD, HBM2 held them up yet again (just like the first rev, not to mention it will probably limit supply again just like HBM1). Benchmarks have shown Vega beating 1080 by 10%. Unfortunately it's now facing 1080ti (running like Titan), due to HBM2 just hitting production and delaying Vega. Lastly, Raja said the driver team is almost entirely dedicated to Vulkan now: "I only have a finite number of engineers so they’re just focused on Vulkan."
That means DX11 people and OpenGL will be left wanting. So even if Vega ends up 20-30% faster than 1080 in what they like (vulkan/dx12?), 1080ti will likely at worst tie it in most stuff, and if needed they can always put out a card with 30 SM units enabled instead of 28, right (just a p6000 at that point, right? Surely there are a few extras lying around)? Surely they have cherry-picked enough titan chips by now that fully work, "just in case" they're needed. I see them constantly talking 4k, which again ignores the fact that 95% of us are running 1920x1200 or lower. Who cares if you win where nobody plays? They seem to be forgetting that a full 50% of the market is running win7 and dx11 too. I won't likely be going win10 unless forced (2020? ROFL). There aren't enough games coming this year to make me want win10/dx12, and vulkan will run in win7. But I don't see a ton of Vulkan patches coming so far for older current games. Things could change, but we'll see. I'd rather bet on what I can WIN today, not what you hope might happen one day. How long have people waited for bulldozer to be a winner? How long will it take for ZEN to get fixed on gaming? Will it ever? AMD themselves said they're already looking for gains in Zen2. PCper thinks games will look the same on Ryzen for good (so no faith in fixes, in their opinion, based on AMD talk).
Looks like we'll get two cards with about the same bandwidth etc., but with NV having the dough to make drivers for all APIs, not just vulkan. Not doubting AMD will have great hardware; it's the drivers that will likely keep them down. Raja himself said they're completely focused on Vulkan (so ignoring DX12, DX11, OpenGL for now? Perhaps DX12 is good enough?). Not a happy camper when AMD comes right out and says basically both products are short on R&D money. Now we've seen 1080ti (just reading reviews...LOL); board partners will make it even faster. Hope AMD can make enough vega to at least pull down some cash with it (meaning HBM2-limited again, probably).
$15 to buy a brand new Nvidia Parker SOC? Are you serious? If you have the source, I'll buy thousands of Nvidia Parkers from you. LOL
$15 to make it. Production cost, duh! What it oughta cost is $50. Put the money on the table, and I will sell you as much stuff that costs me $15 to make for $50 as you want.
LOL! Production cost? The iphone 7's production cost is <$200, but it's sold at $700. Go and accuse every company: why don't you all sell at your production cost?
Even with your prices it's ~$100, so 10x would be $1000. Also, tell your story to Intel, who was losing $4.1B a year on mobile. Nvidia is the same story, just smaller and selling far less. BTW, the soc likely costs more than $30 to make now, maybe quite a bit more, since they make nowhere near as many as, say, apple: http://news.ihsmarkit.com/sites/ihs.newshq.busines... Apple's A10 is $27. I'm fairly certain NV's chips are above this since they are not likely making 50-100mil of them. In a teardown document for the Xiaomi Mi 3, the soc cost was $27, and that was a LONG time ago. They are growing in size. IE, the upcoming Volta soc is expected to be 300mm². That isn't cheap to make. Consider AMD's gpu for the ps4/xbox1: it is not much bigger, costs $90-105 to make, and they sold them for $100-110 (ps4/xbox1 respectively) upon release, when they said they had single-digit margins (now supposedly at mid-teens, which I take to mean not more than 15%, and the quarterly reports and sales of both units back this up). The Volta chip is expected to be 7B transistors. Barely incremental updates? Even mighty Intel was losing $4.1B a year...LOL. A 14nm 165W Broadwell (24-core IIRC) has 7.2B transistors, so you should be able to see the complexity here. For perspective, the GTX 1080's die size is 314mm² and also has 7.2B transistors. So the new Volta soc is about as big as the GTX 1080's die. You don't just have to R&D it either; you have to pay to tape it out etc.
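As a sanity check on die cost, here's the usual dies-per-wafer approximation for a die that size; the wafer price is an assumption, since real 16nm wafer prices aren't public:

```python
import math

# Rough dies-per-wafer for the ~300 mm^2 die cited above, using the
# standard approximation; the wafer price is an assumption.
die_mm2 = 300.0       # expected Volta soc die size from above
wafer_d = 300.0       # 300 mm wafer
dies = (math.pi * (wafer_d / 2) ** 2 / die_mm2
        - math.pi * wafer_d / math.sqrt(2 * die_mm2))
wafer_cost = 6000.0   # assumed $ per wafer
print(int(dies), "gross dies")                                      # ~197
print(f"${wafer_cost / int(dies):.0f} per die before yield loss")   # ~$30
```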
https://semico.com/content/soc-silicon-and-softwar... A $20 soc design is required to ship 10mil units just to break even on older tech. It's getting more expensive now with 10nm coming, not to mention the upward trend of software costs to go with it (69% CAGR per shrink from 28nm down to 7nm!). I don't think they're talking about chips the size of Nvidia's either (nor a samsung/apple); more likely some chinese crap. "Total SoC design costs increased 89% from the 28nm node to the 14nm node and are expected to increase 32% again at the 10nm node and 45% at the 7nm node." They mention the number of low-cost chips keeping average costs down. But that isn't Apple/Nvidia/Samsung's top socs. To date, Nvidia hasn't made a dime on their socs. That probably won't happen until the soc segment reaches $1B in revenue. Last I checked they are FAR from that (about $500mil/yr). Since they sell software with the hardware (a total solution for cars), it might take less than a billion to break even now, but they still have yet to make money in this segment, so you're really not making sense here.
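Back-of-envelope on that break-even claim; both figures inside are inferred to make the report's numbers line up, not quoted from it:

```python
# Back-of-envelope for "a $20 soc needs ~10M units to break even":
# break-even volume = NRE / margin per chip. Both the $50M NRE and the
# $5 margin are assumptions chosen to match the 10M-unit claim.
nre = 50e6              # assumed design + software cost, older node
margin_per_chip = 5.0   # assumed gross margin on a $20 soc
print(f"{nre / margin_per_chip:,.0f} units to recoup the NRE")  # 10,000,000
```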
http://electroiq.com/petes-posts/2015/01/26/expone... Another one showing costs blowing up. "McGregor said the tapeout costs to do a single device are very high. 'You have to put $100 million into a semiconductor startup today to be able to get to productization.' This means that big companies will be getting bigger. 'There will still be some small companies - but I think the mid-sized company in our industry, in devices, is going to dramatically go away because of the scale and other things required,' he said."
This crap isn't cheap. You should get the point. It might not be several billion, but they do spend about $1.5B a year on R&D now. They make a new IP and spread it over pro/datacenter/auto/gamer etc. If it weren't for the others making money, the socs would have died by now. Auto etc. is looking promising at some point, but this is like spending 8yrs on Cuda to get it entrenched (and billions). I wish they'd make a 500W PC-like box that would accept their discrete gpus (I'd hoped this would be rev2 of shieldTV, but that hasn't come yet; maybe rev3?). I think they'd do much better offering a box that is far closer to, or even better than (with a discrete gpu), the xbox1/ps4 etc. It can't be that hard to put it in a much larger box, strap a heatsink/fan on it and run it at 50-100W.
You don't just do IP and then churn out cheap design after cheap design. EACH design costs a bunch of money to make and support (software) before you even get it into a product. IE, in the article above the guy mentions a company needing $100mil to get to productization. Hence the smaller companies dying soon or never even starting up. He also mentions they get pretty much nothing from the software, and now have far more software engineers than hardware engineers, the software ending up just being the gift wrap around the chips, as he puts it. It's not as cheap as you're making things out to be. Intel couldn't make a DIME on mobile for years ($4B+ losses yearly until they gave it up).
16GB of DDR4 for that i7 is $85-100 alone, never mind the chip price etc...LOL. You seem confused about the price that Intel device would have to sell for to make back all the design costs of everything in it and turn a profit. The Jetson TX2 isn't a raspberry pi. :) You aren't the target market, so get over it. Go buy a raspberry pi or get a better job so you can afford better toys...ROFL. This is being sold to car dealers/universities etc.
Dev boards have always been more expensive. Why would you try to compare it with regular PC parts that we can buy separately? The Qualcomm dev board for the snapdragon 820 (Open-Q 820) costs pretty much the same:
For an embedded development kit, the price is pretty reasonable. Besides that, these things are likely to sell to institutional and corporate buyers rather than individual tinkerers. Those organizations won't flinch at the price.
Who told you dev boards are cheap? Almost every semi company sells dev boards for several times more than they're worth, except those who give them away for free to grab market share. I just got a TI DLP dev board for $5000. Is it also overpriced? Yes! But considering that maybe hundreds of engineers worked on it, it's worth it, and that's the business.
Yeah, the fact that the Switch appears to be using not this chip but the X1 just rubs more dirt in.
Half the power in efficiency mode would have added an hour or two of mobile gaming without compromising performance. And docked mode would have been far better - maybe Zelda wouldn't be dropping to 20fps...
Hah! My first thought was "damn, that's a cheap dev board". Guess most commenters are used to buying consumer mass-production stuff, and aren't used to thousand-dollar dev boards for single/double-digit-dollar ICs being the norm. You're not paying out the nose for the hardware; nobody is pretending that's inherently expensive (even for short runs). You're paying for the direct line to the vendor to go "hey, I need to do Random Edge Use Case X and it's not working right, fix it!" and have them actually listen and work with you to fix it.
I think they could sell it to the masses without support and make huge money as something like a high-performance version of the Raspberry Pi for $99, because if they can do that for a big company like Nintendo, they should do it for us normal peasants too. Companies are not > people.
Nvidia has no interest in or need for making an RPi competitor. These dev boards are expensive because they only make a (relatively) small number of them, so that people will buy the chips on their own.
You seem to have missed the initial part of the article, where it says Nvidia realized the market was for things ready to deploy. It's not about creating a competitor for the Raspberry Pi; it's the philosophy behind the idea of a portable platform, something you can use in a drone, a small surveillance camera, etc...
Companies have more buying power and a greater need for this sort of hardware than individual consumers do. When a business is creating a product for sale to a particular group of buyers, they take a lot of factors into account. I suspect that NV considered companies were indeed greater than people during the course of the design and product pricing process. That isn't always the case for NV, but it was this time around.
I thought there was a Core M Compute Stick. Or at least, there was...
Tesla reckon they can code a full Level 5 self-driving car using the Drive PX2, which has the same compute as this; I doubt a Core M can do that. Nowhere near enough GPU power.
ddriver - Tuesday, March 7, 2017 - link
Once again - too expensive. Even more "too expensive" than the previous one. Not that there can be a direct comparison between market niches, but for that amount of money you can get a x86 itx board with a quad core i7, 16 gigs of ram and a decent gpu.Other vendors sell dev/eval boards below their actual value in order to hook up OEMs to use the platform in their products, and then there is nvidia, too big (or small) to miss on the opportunity to rake in profits on something as limited in volume as dev boards.
Way to go nvidia, I mean winning designs for a chip that costs like 15$ to make and sells at 400$ to OEMs... That would actually explain why nvidia socs are nowhere to be found in consumer products.
jordanclock - Tuesday, March 7, 2017 - link
Will that ITX i7 system use 15 watts?ddriver - Tuesday, March 7, 2017 - link
You appear to have missed the "Not that there can be a direct comparison between market niches" part.BTW this product is marketed for automotive. Not mobile phones or tablets, where battery life is critical. Vehicles which have kilowatts of power at their disposal, which would have no problems running more power hungry platform.
So while an i7 itx platform would probably not use 15 watts, it will offer much better performance and value for the price and power consumption. And being more power hungry would not be detrimental if you put it in a car. So no, fitting in a 15 watt power budget is not a big whoop, and certainly doesn't justify the ridiculous price.
Meteor2 - Wednesday, March 8, 2017 - link
So why make your comparison, if you admit it's not possible?By all accounts there's a market for this board; it is being purchased and it is being used. That suggests the pricing is correct. If it were cheaper it would likely not leave enough margin for future R&D to continue moving the state of the art onwards.
Vatharian - Wednesday, March 8, 2017 - link
Where you're wrong, that there is no direct comparison. There is in form of Atom boards, but they're a bit underpowered. But then again, there is literally copetitor in form of Intel Gordon Ridge, which is nearly ready made into automotive format, with ready-to-install kits.And even there, i7, or really any architecture beyond Haswell easily scales from 217 to 1.3W, even on same chip.
As long as you can create robust silicon with abundance of GPIO/can and decent software, anything is open game.
Also main limitation of automotive isn't power (which is important in standby, by the way, no one likes to discover that your car battery has gone flat in a week you haven't been driving your vechicle), but cooling. Leaving a car in 50°C/120°F for a day shouldn't damage the thing.
hecksagon - Wednesday, March 8, 2017 - link
Power isn't a problem in automotive, what is a problem is heat. In automotive you have to validate a design for every conceivable environment. This includes days of -40°F and 120° temperatures, with a hot engine running and limited airflow inside the dash. You think a desktop i7 is going to run in a cramped dash with no airflow? Is it going to last for the life of the vehicle? The nVidia solution is much better because it keeps the TDP low, performance relatively high, has soldered components, and may even require no active cooling in production versions. That removes or mitigates dozens of failure modes.extide - Sunday, March 12, 2017 - link
There is always this: http://www.anandtech.com/show/11052/asrock-shows-d... -- ITX w/MXM -- so you can put a 35W i7 in there with a 1080 MXM...extide - Sunday, March 12, 2017 - link
Sorry, MicroSTX, so even smaller...BertrandsBox - Tuesday, March 7, 2017 - link
This isn't for your everyday computing. It's not meant for tablets, desktops, or anything of that sort. This is specifically developed for 'edge cases' in AI where it becomes desirable to shift processing on-location and not be constantly sending data back and forth to a data center to make sense of anything.Your example of an x86 board, with a quad core i7, 16 GB of RAM, and a 'decent' GPU would take up at a minimum, ~6.64 times more space (assuming mini-ITX), consume untold times more energy, and certainly isn't going to fit in anything like a traffic cam, like this is designed for.
I'm gonna have to assume you're trolling, because even there's no way anybody would write such an asinine comment without doing it on purpose.
ddriver - Tuesday, March 7, 2017 - link
I am going to assume you are an nvidia fanboy, who has never seen an itx board in real life, because if you had you'd realize it is not that much bigger than this dev board. And judging by laptop mobos, i7 can fit into much smaller spaces and into much more modest power budgets. Amazing, I know.Untold times more energy? Maybe numbers beyond 4 boggle your mind too? There is no magic to this, it is not intrinsically superior, the computing performance it delivers is in line with the power it consumers. A vega based gpu has 25 tflops of fp16 throughput at below 250 watts, meaning less than 10 watts per tflop, whereas this thing gets its tflop at 15 watts, while costing as much as solution that is 25 times more powerful. You mean to tell me that if amd were to chop off 1/25th of a vega to get a solution that does 1 tflop per 10 watts of power budget, that would be such an achievement that it will justify asking the same price for this 1/25th as for the 25 times more powerful solution?
It is not a miracle product. It's power requirements are well in line with the performance it offers. It consumers little power because it doesn't have too many transistors and it doesn't offer that much performance. Its power requirements do not justify its ridiculous cost. It should cost like 50 bucks for the module, 75 for the dev board.
Meteor2 - Wednesday, March 8, 2017 - link
I don't understand, your common sense seems to have gone out the window. AMD Vega performance? It's not been released, we don't know what it is. Chop off 1/25 of a Vega GPU? What about the chipset, the CPU?Elsewhere you talk about i7 and cheap dGPUs. Are you seriously suggesting that you can get more perf/W in a 15W TDP package using x86 and non-latest-gen GPU uarch?
philehidiot - Wednesday, March 8, 2017 - link
Ladies, please! You're both wrong. ;)(I have absolutely no idea what you're talking about).
Meteor2 - Wednesday, March 8, 2017 - link
:-)nico_mach - Wednesday, March 8, 2017 - link
It's not a miracle product, but these applications specifically require a higher ratio of GPU:CPU power, making this product superior in those places. There's a reason they've released this and there's a reason that intel boards in this area always have dgpus. It's absolutely essential and if AMD has really caught up to Intel, the Intel piece will be entirely overpriced.The way ARM has improved, Intel's presence in any embedded mobile market is probably temporary.
BertrandsBox - Wednesday, March 8, 2017 - link
Yes I have seen an ITX board, which is why I actually did the maths and gave you a number. A mini-ITX motherboard is 6.64 times the size of the TX2. And no, I'm not an NVIDIA fanboy, considering the last product I owned from them was the GeForce 6200 over a decade ago.You're comparisons are becoming more and more ridiculous. You can't just 'chop' off 1/25th of an UNRELEASED chip and expect to achieve the same results in efficiency, never mind the fact that you've just conveniently ignored the complete lack of CPU in your 10W example.
Speaking of lacking, where do all the other components fit into your imaginary board? ac WIFI? Bluetooth? 8GB LPDDR4 RAM? 32GB storage?
I assume in your wonder board, these don't cost any money and don't consume any power.
hahmed330 - Wednesday, March 8, 2017 - link
Intel GPUs, especially in atom based SOCs, are rubbish they always have taken more die area/performance, very low utilisation efficiency, poor power efficiency, non-existent AI capability (no efficient FP16 or INT8 capability) and are very imbalanced designs.Yojimbo - Tuesday, March 7, 2017 - link
Pretty sure there's nothing else on the market that can do what this thing can do. I can say that with confidence because if there were it would be used in self-driving car platforms. "Not that there can be a direct comparison between market niches, but for that amount of money you can get a x86 itx board with a quad core i7, 16 gigs of ram and a decent gpu." For that amount of money one can also get 100 custom made t-shirts. What's your point?Costs $15 to make? Are you talking about manufacturing costs? Firstly maybe the SoC alone costs $15 to make(?) I'm guessing the whole board is a lot more than that. But so what? It's part of a multi-billion dollar R&D effort spanning software and hardware.
ddriver - Tuesday, March 7, 2017 - link
15$ for the chip, 30$ for the module, 60$ for the dev board. Those would be realistic production costs, I mean how much it costs them to make it. What justifies asking 10 times more than its worth?"Multi-billion" R&D costs are covered by selling a lot of them chips at reasonable margins, not by selling a few but ridiculously overpriced. And if it really did cost "multi-billion" to R&D such products, they nvidia would be bankrupting itself as it hasn't got anywhere near that amount of revenues from automotive since they got into, and it would eat most of their revenues from other market niches. Which would conflict with their financial results, which indicate that the actual R&D cost for those products in far from "multi-billion" and more like "several-million". And that's the beauty of IP - once you do the design, you can churn millions upon millions of products based on it, which reduces the R&D cost per product to almost negligible. Especially when the IP is not something brand new from scratch but barely incremental update of a previous design.
Quite frankly, while it is understandable why I would criticize the pricing, not only from my perspective, but also from the perspective that they wouldn't be able to win many designs at that cost, it is quite curious why individuals like you rush so desperately to defend it? Why are people like you acting as if I have said something bad about your mommas?
Buk Lau - Wednesday, March 8, 2017 - link
Thank you ddriver for actually making sense. This new Jetson board is just way too overpriced to even be properly justified for its "niche market."Meteor2 - Wednesday, March 8, 2017 - link
I'm at a loss over why people are crying off over the price. Do you want one, but can't afford it? Or do you fear for the budgets of the University of Oxford and Tesla?willis936 - Wednesday, March 8, 2017 - link
Or, you know, they'd like to see many companies succeed so technologies are more competetive. If no one cared then no one would be reading a pointless article on a pointless site.ddriver - Wednesday, March 8, 2017 - link
This is not a consumer product. It is not the kind of thing you "want one from". I could put this this to a good use, and buy tens of thousands, had it been worth the money. At this price it is not.Once again, what's more puzzling here is why people like you defend the ridiculous pricing with such dedication. And sure, some big OEMs with massively overpriced products could afford it. A billionaire could afford to blow his nose on 100 dollar bills, but do the do that?
Yojimbo - Wednesday, March 8, 2017 - link
""Multi-billion" R&D costs are covered by selling a lot of them chips at reasonable margins, not by selling a few but ridiculously overpriced. And if it really did cost "multi-billion" to R&D such products, they nvidia would be bankrupting itself as it hasn't got anywhere near that amount of revenues from automotive since they got into, and it would eat most of their revenues from other market niches."The multi-billion dollars spans all their products. Though there are some additional hardware and firmware costs associated specifically with Tegra and specifically with Jetson, and there is similarly software development that covers Jetson but not, for instance, gaming.
"And that's the beauty of IP - once you do the design, you can churn millions upon millions of products based on it, which reduces the R&D cost per product to almost negligible."
It's not even close to almost negligible! You can't just conveniently divide their costs the way you see fit for your rant of the day.
"15$ for the chip, 30$ for the module, 60$ for the dev board. Those would be realistic production costs, I mean how much it costs them to make it. What justifies asking 10 times more than its worth?"
The price of the board isn't based on cost. The cost limits how low of a price they can charge but it doesn't set the price. The price is based on value. NVIDIA is able to charge the money it can charge because of the value their product provides. That's how money is made, by providing a good or service with a value that is greater than the cost to provide the good or service. The fact that you think that the cost to produce the board (whether you figure in the r&d and sg&a costs or not) is what the board is "worth" shows that you don't understand the very basics of business or economics.
"Quite frankly, while it is understandable why I would criticize the pricing, not only from my perspective, but also from the perspective that they wouldn't be able to win many designs at that cost, it is quite curious why individuals like you rush so desperately to defend it? Why are people like you acting as if I have said something bad about your mommas?"
This is a public message board and when we see you say something stupid we point it out. And the things you have said are quite stupid in a basic economic sense. Don't get mad about it. Educate yourself. As far as getting more design wins, it's interesting (i.e. laughable) that you think you can judge the supply/demand curve of the developer board market better than the company themselves can. But from the things you've said you most likely don't have any idea of what a supply/demand curve is or what it means. You're just basing your judgments on thin air. If you think a product is worth its cost to produce then a supply/demand curve is useless to you. Of course if you were making the board I hope eating would be useless to you as well, as well, but maybe you'd add eating cost into the worth of the product. If you wanted a shiny new Ferrari I guess you'd have to add that into the developer board's worth, too.
Itselectric - Wednesday, March 8, 2017 - link
You are wrong, they're not saying that they should sell at cost, but that with R&D included their cost is maybe 120 all in per dev kit. When they ask 600 for it, that's grossly overpriced and their marginal cost will not be equal to their marginal revenue.Meteor2 - Wednesday, March 8, 2017 - link
I still don't get why anyone cares. Ferraris are expensive. Rolexes are expensive. Boeings are expensive. It happens.I'msureyouareatroll - Wednesday, March 8, 2017 - link
Wrong. That's branding and it's specifically designed to target different users/audiences. I know nothing about you but, assuming you are a mortal as I am, these things weren't designed to target you or me. The massive dev board of the TX1/2 is completely useless, if you have to install any peripheral via usb, like a camera, you end up with just one spare micro-usb port, which means you need to buy an extra usb hub in order to connect mouse-keyboard as a minimum. You then have to invest another 300-500 € to buy a dedicated carrier board if you want to take full advantage of the jetson TX1 size. If you don't believe me, go and check the prices of carrier boards sold by auvidea or connecttech.On top of this, support is virtually nonexistent, and the community is virtually no one. You are left hanging with all sort of questions for days/months, the so-called advantages of, for example, connecting several cameras to the Jetson is virtually a wet dream: there are no drivers available and if you want the only ones in existence you've got to pay another €2500.
If you compare this thing with a widely use Raspberry pi in terms of community/support and even hardware usability, I can take this thing will never take off.
jospoortvliet - Thursday, March 9, 2017 - link
Dude, what are you even talking about? What is 2.5k if you are designing a car and need a hand full of these devices boards? These prices are perfectly fine for the niche this board is aiming at. It isn't aiming at home users playing with raspberry pi boards...Yojimbo - Wednesday, March 8, 2017 - link
No, he specifically said $15 dollars to produce. And then he said it was worth what it costs, so obviously he wants it to be sold at cost.ddriver - Wednesday, March 8, 2017 - link
Oh so you are a stupidity pointer out, are you. That's quite amusing, considering your post is riddled with stupidity and general lack of understanding of the subject or even common sense. Unlike you I have many years of experience in that field, I know how much stuff costs, and how much stuff is worth.nvidia is pretending to be intel of lately, they act all high and mighty and expect ridiculous margins. Which is why their mobile platforms are nowhere to be seen. They are aiming to milk one particular cow they know is easiest to milk, but they are not in the position to do so. This product in particular is for the time being unique, but that wont last long. We are few months away zen apus which would make this product irrelevant, at least at that price. With the added benefit you don't get locked into cuda, but get to use modern OpenCL which runs on a variety of other platforms, and can even be compiled to verilog and put directly into fpgas or even silicon.
Sure, they are making some money on automotive, but I highly doubt the actual reason for this is their products are worth the money, corporate interests and politics are at play there. Not logic, not reason, not the consumer's best interest.
Yojimbo - Wednesday, March 8, 2017 - link
"That's quite amusing, considering your post is riddled with stupidity and general lack of understanding of the subject or even common sense. "OK. show me what I said that shows lack of understanding or common sense.
"Unlike you I have many years of experience in that field, I know how much stuff costs, and how much stuff is worth."
Saying it's so doesn't make it true.
"We are few months away zen apus which would make this product irrelevant, at least at that price. With the added benefit you don't get locked into cuda, but get to use modern OpenCL which runs on a variety of other platforms, and can even be compiled to verilog and put directly into fpgas or even silicon."
Yeah sure. Let's wait a couple years and see how that works out. Using OpenCL might be great if there were both strong hardware and strong libraries for it. Even if Zen APUs are competitive, AMD is going to leave it up to everyone else to create the development tools. It's something that takes years to do.
"Sure, they are making some money on automotive, but I highly doubt the actual reason for this is their products are worth the money, corporate interests and politics are at play there. Not logic, not reason, not the consumer's best interest."
Yup, everyone is stupid except you. NVIDIA, their customers... You just aren't able to say why. Must be some big conspiracy. If only the world wasn't corrupt you'd be a multi-trillionaire by now.
ddriver - Wednesday, March 8, 2017 - link
Yeah, today goldman sachs tells people to buy nvidia, to anyone who is not evil or retarded that means "do not buy nvidia".Granted, nvidia has the tools, and the libraries, but that's just bait to lock in the lazies. Not everyone is lazy and talentless, not everyone needs to be held by the hand like a little baby.
I already have enough money to not really even care or think about money. That doesn't mean I outta be wasting it on overpriced, poor value stuff that is not worth it. Everyone is stupid, that's true. I am stupid too. Just less stupid than most. I am smart enough to know what I am stupid about. Unlike you ;)
jospoortvliet - Thursday, March 9, 2017 - link
Dude. Take your meds.TheJian - Friday, March 10, 2017 - link
Let me know when AMD has 8yrs of R&D and a few billion stuck in OpenCL development. They can't even properly launch a cpu (see reviews, games don't work right, SMT screwed, boards not even ready etc) or gpu (see last gen). If AMD doesn't actually start MAKING money at some point they're screwed. They have lost $8B+ in the last 15-20yrs. That's not good right? They've laid off 30% of their engineers in the last ~5yrs. They've been restructuring for 5yrs. The "tools and the libraries" are what you pay the extra dough for. Cuda works because they stuck a decade of development and cash into it. It's taught in 450 universities across a few dozen countries.The point of the tools etc is smaller guys can get in. The point of using something like unreal engine is a smaller guy can make a credible game. You don't seem to get the point of all this stuff. not everybody has the time or money to develop an end to end solution (even larger companies buy qualcomm etc to get the modem and all in one for mobile etc) so part of the value of a device like this (or drive px etc) is all that you get on top of the device.
10yrs ago I would not have thought about game dev. It would have taken 10yrs to make a crappy game. Today on multiple engines (take your pick) I can make something worth paying for in a few years alone if desired. If you think that guy doing this is lazy or talentless you're dumber than you think ;) Sorry you're stupid. I'm ignorant about some stuff (cars, couldn't care less about learning them), but because I choose to be. But I'm not stupid. Comic you mention the stock, I'm guessing it will be $125-150 in the next year (under $100 today - $20 off in the last month). Auto's will at some point make them money on socs (and I think they'll re-enter mobile at 10nm or below as modem's can be added without watt costs etc), and AI/Big data will get them the rest of the way. Record revenue, margins, income will keep happening. Next rev of cards will probably be able to be priced another $50 across the board because Vega won't likely be able to do much against either Nvidia's new lineup of revved up boards with faster mem (GDDR5x on almost everything shortly and faster clocks across the lineup on top), or if that isn't enough we'll probably see 12nm Voltas for xmas if needed or at least Q1. Worst case NV just lowers prices until they put out the winner again just like Intel would do. Unlike AMD, both of their enemies can fight a price war for a year and win it next year. AMD will do better vs Intel (should get some expensive server chip sales) than nvidia. Intel has been paying so much attention to racing down to ARM they forgot about AMD even being alive. Nvidia hasn't done that and likely has multiple ways to play this out without a change in market share or much margin loss. Unlike Intel they never sat on their laurels. They've forged ahead and even taken the smarter/cheaper (GDDR5x) and much easier to produce route. HBM2 like HBM will be a problem for AMD going alone. If NV was in it maybe it wouldn't be expensive (NV could put what AMD is pulling), but alone they'll be killing profits just like last time and already late again just like last time giving NV more room to make adjustments.
It's comic AMD's slides compare HBM2 to GDDR5. That isn't what the competition will be using. They're going to be top to bottom GDDR5x shortly except for the bottom card. NV has the next 3 months to sell 1080ti and capitalize on top pricing then be able to cut if needed and not lose much having already milked the cow. Unfortunately for AMD, HBM2 held them up yet again (just like the first rev, not to mention will probably limit supply again just like HBM1). Benchmarks have shown Vega beating 1080 by 10%. Unfortunately it's now facing 1080ti (running like Titan) due to HBM2 just hitting production and delaying Vega. Lastly Raja said the driver team is almost entirely dedicated to Vulkan now:
"I only have a finite number of engineers so they’re just focused on Vulkan."
That means DX11 people, OpenGL will be left wanting. So even if Vega ends up 20-30% faster than 1080 in what they like (vulkan/dx12?), 1080ti will likely at worst tie it in most stuff and if needed they can always put out a card with 30 sm units on instead of 28 right (just a p6000 at that point right? Surely there are a few extras lying around)? Surely they have cherry picked enough titan chips by now that fully work "just in case" they're needed. I see them constantly talking 4k which again is ignoring the fact that 95% of us are running 1920x1200 or lower. Who cares if you win where nobody plays? They seem to be forgetting a full 50% of the market is running win7 and dx11 too. I won't likely be going win10 unless forced (2020? ROFL). There aren't enough games coming this year to make me want win10/dx12 and vulkan will run in win7. But I don't see a ton of Vulkan patches coming so far for older current games. Things could change but we'll see. I'd rather bet on what I can WIN today, not what you hope might happen one day. How long have people waited for bulldozer to be a winner? How long will it take for ZEN to get fixed on gaming? Will it ever? Since AMD themselves said already looking for gains on Zen2. PCper thinks games will look the same on Ryzen for good (so no faith in fixes in their opinion based on AMD talk).
Looks like we'll get two cards with about the same bandwidth, etc., but with NV having the dough to make drivers for all APIs, not just Vulkan. I'm not doubting AMD will have great hardware; it's the drivers that will likely keep them down. Raja himself said they're completely focused on Vulkan (so ignoring DX12, DX11, and OpenGL for now? Or perhaps DX12 is good enough?). I'm not a happy camper when AMD comes right out and basically says both products are short on R&D money. Now we've seen the 1080 Ti (just read the reviews...LOL), and board partners will make it even faster. Hope AMD can make enough Vega to at least pull down some cash with it (probably HBM2-supply-limited again, though).
LostInSF - Wednesday, March 8, 2017 - link
$15 to buy a brand new Nvidia Parker SoC? Are you serious? If you have the source, I'll buy thousands of Nvidia Parker chips from you. LOL
ddriver - Wednesday, March 8, 2017 - link
$15 to make it. Production cost, duh! What it ought to cost is $50. Put the money on the table, and I will sell you as much stuff that costs me $15 for $50 as you want.
LostInSF - Thursday, March 9, 2017 - link
LOL! Production cost? The iPhone 7's production cost is <$200, but it sells at $700. Go accuse every company: why don't you all sell at your production cost?
TheJian - Wednesday, March 8, 2017 - link
Even at your prices it's $100, so 10x would be $1000. Also, tell your story to Intel, who was losing $4.1B a year on mobile. Nvidia is the same story, just smaller losses on far fewer sales. BTW, the SoC likely costs more than $30 to make now, maybe quite a bit more, since they make nowhere near, say, Apple's volumes: http://news.ihsmarkit.com/sites/ihs.newshq.busines...
Apple's A10 costs $27. I'm fairly certain NV's chips are above this, since they are not likely making 50-100 million of them. A teardown of the Xiaomi Mi 3 put its SoC cost at $27, and that was a LONG time ago. These chips are growing in size; e.g., the upcoming Volta SoC is expected to be ~300mm². That isn't cheap to make. Consider that AMD's chip for the PS4/Xbox One is not much bigger, costs $90-105 to make, and sold for $100-110 (PS4/Xbox One respectively) at launch, when AMD said they had single-digit margins (now supposedly mid-teens, which I take to mean not more than 15%; the quarterly reports and sales of both units back this up). The Volta chip is expected to be 7B transistors. Barely incremental updates? Even mighty Intel was losing $4.1B a year...LOL. A 14nm 165W Broadwell (24 cores, IIRC) has 7.2B transistors, so you should be able to see the complexity here. For perspective, the GTX 1080's die is 314mm² and also has 7.2B transistors, so the new Volta SoC is about as big as the GTX 1080's die. You don't just have to R&D it either; you have to pay to tape it out, etc.
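To put rough numbers on why a die that size can't cost $15, here's a quick back-of-the-envelope sketch in Python using the standard dies-per-wafer approximation and a simple Poisson yield model. Everything except the 314mm² die area is my own illustrative assumption (the wafer price and defect density especially are not vendor figures):

    import math

    # All assumptions mine: wafer price and defect density are illustrative,
    # not vendor figures. The 314mm^2 die area is the GTX 1080-class figure
    # mentioned above.
    WAFER_DIAMETER_MM = 300.0
    WAFER_COST_USD = 7000.0        # assumed price of a leading-edge 300mm wafer
    DIE_AREA_MM2 = 314.0
    DEFECT_DENSITY_PER_CM2 = 0.1   # assumed

    def gross_dies_per_wafer(diameter_mm, die_area_mm2):
        # Standard approximation: wafer area / die area, minus edge loss.
        radius = diameter_mm / 2.0
        return (math.pi * radius ** 2 / die_area_mm2
                - math.pi * diameter_mm / math.sqrt(2.0 * die_area_mm2))

    def poisson_yield(die_area_mm2, defects_per_cm2):
        # Simple Poisson yield model: y = exp(-die_area * defect_density).
        return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

    gross = gross_dies_per_wafer(WAFER_DIAMETER_MM, DIE_AREA_MM2)
    y = poisson_yield(DIE_AREA_MM2, DEFECT_DENSITY_PER_CM2)
    print(f"gross dies: {gross:.0f}, yield: {y:.0%}, "
          f"cost per good die: ${WAFER_COST_USD / (gross * y):.0f}")
    # With these assumptions: ~187 gross dies, ~73% yield, ~$51 per good die.

Even with generous assumptions, a 300mm²-class die comes out around $50 in silicon alone, before packaging, test, tapeout, or any R&D amortization.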
https://semico.com/content/soc-silicon-and-softwar...
A $20 SoC design needs to ship 10 million units just to break even, and that's on older tech; it's getting more expensive now with 10nm coming. Not to mention the upward trend of the software costs that go with it (69% CAGR per shrink from 28nm down to 7nm!). I don't think they're talking about chips the size of Nvidia's either (nor Samsung's/Apple's). More likely some Chinese crap.
"Total SoC design costs increased 89% from the 28nm node to the 14nm node and are expected to increase 32% again at the 10nm node and 45% at the 7nm node."
They mention the volume of low-cost crap keeping average costs down, but that doesn't describe Apple's/Nvidia's/Samsung's top SoCs. To date, Nvidia hasn't made a dime on their SoCs. That probably won't happen until the SoC segment reaches $1B in revenue, and last I checked they are FAR from that (about $500 million/yr). Since they sell software with the hardware (a total solution for cars), it might take less than a billion to break even now, but they have yet to make money in this segment, so you're really not making sense here.
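The break-even logic here is just design cost (NRE) divided by gross margin per unit. A minimal sketch, with numbers I've picked purely to be consistent with the "$20 SoC needs ~10 million units" figure quoted above (the NRE and unit cost are assumptions, not real Nvidia or Semico data):

    # Break-even units = design cost (NRE) / gross margin per unit.
    # Illustrative numbers only, chosen to match the "$20 SoC needs
    # ~10 million units" claim above; not actual vendor data.
    nre_usd = 50_000_000      # assumed total design + software + tapeout cost
    asp_usd = 20.0            # selling price per SoC
    unit_cost_usd = 15.0      # assumed manufacturing cost per unit
    margin_usd = asp_usd - unit_cost_usd
    print(f"break-even at {nre_usd / margin_usd:,.0f} units")
    # -> break-even at 10,000,000 units

Grow the NRE (bigger die, newer node, more software) or shrink the per-unit margin, and the break-even volume climbs fast, which is the point: at Nvidia's current SoC volumes, that math doesn't close yet.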
http://electroiq.com/petes-posts/2015/01/26/expone...
Another one showing costs blowing up:
"McGregor said the tapeout costs to do a single device are very high. “You have to put $100 million into a semiconductor startup today to be able to get to productization.” This means that big companies will be getting bigger. “There will still be some small companies – but I think the mid-sized company in our industry, in devices, is going to dramatically go away because of the scale and other things required,” he said.
This crap isn't cheap; you should get the point. It might not be several billion, but they do spend about $1.5B a year on R&D now. They make new IP and spread it over pro/datacenter/auto/gamer, etc. If it weren't for the other segments making money, the SoCs would have died by now. Auto etc. is looking promising at some point, but this is like spending 8 years on CUDA to get it entrenched (and billions). I wish they'd make a 500W PC-like box that would accept their discrete GPUs (I'd hoped this would be rev 2 of the Shield TV, but that hasn't come yet; maybe rev 3?). I think they'd do much better offering a box that comes far closer to the Xbox One/PS4, or even beats them (with a discrete GPU). It can't be that hard to put it in a much larger box, strap a heatsink/fan on it, and run it at 50-100W.
You don't just do the IP and then churn out cheap design after cheap design. EACH design costs a bunch of money to make and support (software) before you even get it into a product. E.g., in the article above, the guy mentions a company needing $100 million to get to productization; hence the smaller companies dying off soon or never even starting up. He also mentions they get pretty much nothing from the software, and now have far more software engineers than hardware engineers; the software ends up just being the gift wrap around the chips, as he says. It's not as cheap as you're making it out to be. Intel couldn't make a DIME on mobile for years ($4B+ losses yearly until they gave it up).
16GB of DDR4 for that i7 is $85-100 alone, never mind the chip price, etc...LOL. You seem confused about the price that Intel device would have to sell for to make back all the design costs of everything in it and turn a profit. The Jetson TX2 here isn't a Raspberry Pi. :) You aren't the target market, so get over it. Go buy a Raspberry Pi, or get a better job so you can afford better toys...ROFL. This is being sold to car dealers/universities, etc.
renz496 - Wednesday, March 8, 2017 - link
Dev boards have always been more expensive. Why are you trying to compare it with regular PC parts that we can buy separately? Qualcomm's dev board for the Snapdragon 820 (Open-Q 820) costs pretty much the same: https://shop.intrinsyc.com/products/open-q-820-dev...
A5 - Wednesday, March 8, 2017 - link
Yeah, dev boards are always expensive. I've used reference FPGA boards that cost several times more than this.
BrokenCrayons - Wednesday, March 8, 2017 - link
For an embedded development kit, the price is pretty reasonable. Besides that, these things are likely to sell to institutional and corporate buyers rather than individual tinkerers. Those organizations won't flinch at the price.
I'msureyouareatroll - Wednesday, March 8, 2017 - link
Wrong. This is branded and marketed as a product for developers.
BrokenCrayons - Wednesday, March 8, 2017 - link
Developers have employers. Employers typically purchase development hardware for their employees.
LostInSF - Wednesday, March 8, 2017 - link
Who told you dev boards are cheap? Almost every semiconductor company sells dev boards at several times their value, except those who give them away for free to grab market share. I just got a TI DLP dev board for $5,000. Is it also overpriced? Yes! But considering that maybe hundreds of engineers worked on it, it's worth it, and that's the business.
eddman - Thursday, March 9, 2017 - link
Because the market is full of similar solutions at dirt-cheap prices with a 750 GFLOPS GPU, right? /s
Alistair - Tuesday, March 7, 2017 - link
Woah, that price. A man can dream of this as a home-console-only version of the Switch, can't he? Sigh...
psychobriggsy - Wednesday, March 8, 2017 - link
Yeah, the fact that the Switch appears not to be using this chip but the X1 just rubs more dirt in. Half the power in efficiency mode would have added an hour or two of mobile gaming without compromising performance. And docked mode would have been far better; maybe Zelda wouldn't be dropping to 20fps...
edzieba - Wednesday, March 8, 2017 - link
Hah! My first thought was "damn, that's a cheap dev board". Guess most commentators are used to buying consumer mass-production stuff, and aren't used to thousand-dollar dev boards for single/double-digit-dollar ICs being the norm. You're not paying through the nose for the hardware; nobody is pretending that's inherently expensive (even for short runs). You're paying for the direct line to the vendor, to go "hey, I need to do Random Edge Use Case X and it's not working right, fix it!" and have them actually listen and work with you to fix it.
t.s - Wednesday, March 8, 2017 - link
...and you said that a Windows NUC is overpriced.
Meteor2 - Wednesday, March 8, 2017 - link
How many CUDA cores does it have? Otherwise it seems to be the same as the Drive PX2.
Ryan Smith - Wednesday, March 8, 2017 - link
Parker has 256 CUDA cores.
GoodBytes - Wednesday, March 8, 2017 - link
The chip in the picture looks like the Switch CPU.
GoodBytes - Wednesday, March 8, 2017 - link
*CPU = Tegra SoC
ruthan - Wednesday, March 8, 2017 - link
I think they could sell it to the masses without support and make huge money as something like a high-performance version of the Raspberry Pi for $99, because if it can do the same thing for a big company like Nintendo, it should also work for us normal peasants. Companies are not > people.
A5 - Wednesday, March 8, 2017 - link
Nvidia has no interest in, or need for, making an RPi competitor. These dev boards are expensive because they only make a (relatively) small number of them; the point is for people to then buy the chips on their own.
I'msureyouareatroll - Wednesday, March 8, 2017 - link
You seem to have missed the initial part of the article, where it says Nvidia realized the market was for things ready to deploy. It's not about creating a competitor for the Raspberry Pi; it's the philosophy behind the idea of a portable platform, something you can use in a drone, a small surveillance camera, etc...
BrokenCrayons - Wednesday, March 8, 2017 - link
Companies have more buying power and a greater need for this sort of hardware than individual consumers do. When a business is creating a product for sale to a particular group of buyers, they take a lot of factors into account. I suspect that NV considered companies were indeed greater than people during the course of the design and product pricing process. That isn't always the case for NV, but it was this time around.
Shadowmaster625 - Wednesday, March 8, 2017 - link
I was thinking a Core M compute stick would make more sense than this. There should be enough CPU and GPU in that to do the type of work this can do.
Meteor2 - Wednesday, March 8, 2017 - link
I thought there was a Core M Compute Stick. Or at least, there was... Tesla reckons it can code a full Level 5 self-driving car using the Drive PX2, which has the same compute as this; I doubt a Core M can do that. Nowhere near enough GPU power.
peevee - Thursday, May 11, 2017 - link
"$599 retail"You can build a PC with better performance cheaper. What are they smoking?