btrav13
Jun 12, 10:02 AM
However, you are unfortunately stuck in the position that if you buy the device, you are buying AT&T service. As long as this continues to happen, Apple really doesn't have any incentive to move it to other carriers. I mean, technically they do, but if there are service complaints, yet the very same people who complain still continue to purchase the new one every year, then that's not sending a very strong message, in my opinion.
bugfaceuk
Apr 9, 10:28 AM
Also...
I like the idea of being able to take 3D pictures with the Nintendo 3DS, but that's not worth $250 to me... not at such low resolutions and not when I use my iPhone 4 so much. I like Nintendo, but I don't think they're making good decisions to protect their future. Why don't they work more with independent developers? Why didn't they build their own app store for independent developers? Why not team up with Apple, like Sony sorta is doing with Android?
Nintendo did really well during the last few years. But now, Apple is becoming a threat. Whether you acknowledge the threat to Nintendo or not is irrelevant. Why? Because Nintendo acknowledges the threat.
http://www.businessinsider.com/nintendo-execs-admit-apple-is-the-enemy-of-the-future-2010-5
Your overall point being that because Apple poses a threat to Nintendo, which Nintendo recognises, Nintendo is doomed to go out of business?
Hastings101
Apr 5, 08:36 PM
Are you guys sure that switching is really "worth it"? (serious question)
I don't think it's really worth it. Windows 7 and Snow Leopard are so close together in quality that OS X is no longer obviously the better operating system (in my opinion of course). It's also a pain to have to replace your entire collection of Windows applications with Mac versions or Mac alternatives.
The only reason I still use OS X is because I like the look of it, I like that there are (at the moment) fewer viruses/trojans/whatevers, and I have way too many Mac-only applications that I depend on.
skunk
Apr 24, 10:50 AM
I'm just entertaining the notion of agnosticism as a kind of nod to the great debt we owe Judaism and Christianity. If it wasn't for those two faiths which allowed for reformations (such a thing would be impossible under, say, Islam) then secular Western democracies would be vastly different.
What do you mean by "allowed for"? Do you mean that they could have slaughtered more people in the wars of religion? As for Islam, we probably would not have had a Renaissance without Islam.
If Europe had succumbed to the advance of Islam, if Vienna had fallen in the 17th century, things likely would be very different today. Europe would have produced as many Nobel Prize winners as the entire Islamic World
We would all be speaking German, I expect.
Lesser Evets
Apr 28, 07:35 AM
Almost all of that is due to the iPad. They had around 4% of the global market for computers last year.
And growth is bad?
peharri
Sep 23, 10:25 AM
Perhaps we've just been exposed to different sources of info. I viewed the Sept 12 presentation in its entirety, and have read virtually all the reports and comments on MacRumors, AppleInsider, Think Secret, Engadget, the Wall Street Journal, and MacCentral, among others. It was Disney chief Bob Iger who was quoted saying the iTV had a hard drive; that was generally interpreted (except by MacCentral, which took the statement literally) to mean it had some sort of storage, be it flash or a small HD, and that it would be for buffering/caching to allow streaming of huge files at relatively slow (for the purpose) wireless speeds.
I've read absolutely everything I can too and I have to disagree with you still.
It makes absolutely no sense for Bob Iger to have been told there's "some sort of storage" if this isn't storage in any conventional sense. Storage to a layman means somewhere where you store things, not something transitory used by the machine in a way you can't fathom. So, we have two factors here:
First - Bob's been talking about a hard disk. That absolutely doesn't point at a cache, it's too expensive to be a cache.
Second - Even if Bob got the technology wrong, he's been told the machine has "storage". That's not a term you generally use to mean "transitory storage for temporary objects".
The suggestion that Bob's talking about a cache is being made, in my view, because people know it'll need some sort of caching to overcome temporary 802.11 bandwidth issues. (Hmm. Kind of. You guys do know we're talking about way less bandwidth than a DVD requires, right - and that DVD-formatted MPEG-2 will transmit in realtime over an 802.11g link? What's more, for 99% of Internet users, their DSL connection has less bandwidth than their wireless link, even if they're on the other side of the house with someone else's WLAN in range and on the same channel. Yes, 802.11 suffers drop-outs, but we're talking about seconds' worth of video affected, not hours.) As such, you're trying to find evidence that it'll deal with caching.
YOU DON'T NEED TO. A few megabytes of RAM is enough to ensure smooth playback will happen. This is a non-problem. Everyone who's going this route is putting way too much thought into designing a solution to something that isn't hard to solve.
Nonetheless, because it's an "issue", everything is being interpreted in that light. If there's "storage", it must be because of caching! Well, in my opinion, if there's storage, it's almost certainly to do with storage. You don't need it for caching.
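The arithmetic behind this can be sketched quickly. The figures below are ballpark assumptions (DVD-Video peak video bit rate of roughly 9.8 Mbps, effective 802.11g throughput of roughly 20 Mbps), not anything measured on the iTV itself:

```python
# Back-of-the-envelope check: can 802.11g carry DVD-rate video, and how
# much dropout does "a few megabytes of RAM" cover? All figures are
# approximate ballpark values, not measurements.

DVD_PEAK_MBPS = 9.8       # DVD-Video maximum video bit rate (approx.)
WIFI_G_MBPS = 20.0        # realistic 802.11g throughput (54 Mbps nominal)
BUFFER_MB = 8             # "a few megabytes of RAM"

headroom = WIFI_G_MBPS / DVD_PEAK_MBPS            # ~2x the peak DVD rate
buffer_seconds = (BUFFER_MB * 8) / DVD_PEAK_MBPS  # seconds of video buffered

print(f"throughput headroom: {headroom:.1f}x")
print(f"{BUFFER_MB} MB buffer covers ~{buffer_seconds:.1f} s of dropouts")
```

So even at the DVD peak rate, an 8 MB RAM buffer rides out several seconds of interference, which is consistent with the claim that seconds, not hours, of video are at stake.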
I'm trying to imagine a conversation with Bob Iger where the issue of flash or hard disk space for caching content to avoid 802.11 issues would come up, and where the word "storage" would be used purely in that context. It's hard. I don't see them talking about caches to Iger. It makes no sense. They might just as well talk about DCT transforms or the QuickTime API.
I'm perfectly willing to be wrong. But I don't think I am. Let's continue reading the reports and revisit this subject here in a day or two.
Sure. I'm perfectly willing to be wrong too. I'm certainly less sure of it than I am of the iPhone rumours being bunk.
Regardless of the truth, I have to say the iTV makes little sense unless, regardless of whether it contains a hard disk or not, it can stream content directly from the iTS. Without the possibility of being used as a computer-less media hub, it becomes an overly expensive and complicated solution for what could more easily be done by making a bolt-on similar to that awful TubePort concept.
I'm 99% sure the machine is intended as an independent hub that can use iTunes libraries on the same network but can also go to the iTS directly and view content straight from there (and possibly other sources, such as Google Video.) I can see why Apple would make that. I can see why it would take a $300 machine to do that and make it practical. I see the importance of the iTS and the potential dangers to it as the cellphone displaces the iPod, and Apple's need to shore it up. I can see studio executives "not getting it" with online movies if those movies can only be seen on laptops, PCs, and iPods.
If Apple does force the thing to need a computer, I think they need to come out with an 'iTunes server' box that can fulfill the same role, and it has to be cheap.
balamw
Sep 12, 07:21 PM
Here's another pic from the event today, taken by the Gizmodo guys...
Looking at their other pictures answered a question I was wondering about: does this thing have an Ethernet port? It apparently does. I'd rather not rely on wireless. Right now I have a VGA cable from my iMac to my TV, so I'd gain something by replacing it with a simple Cat5.
I'm a bit surprised not to see any USB or FW ports on there though. I was betting on being able to hook up an optional HDD.
B
javajedi
Oct 11, 08:48 AM
Originally posted by ddtlm
javajedi:
Admittedly I am getting lost in what all the numbers people have mentioned are for, but looking at these numbers you have here and assuming that they are doing the same task, you can rest assured that the G3/G4 are running far inferior software. AltiVec and SSE2 or not, there is just nothing that can explain this difference other than an unfair playing field. There is no task that a P4 can do at 11x or 12x the speed of a G4 (comparing top-end models here). The P4 possesses nothing that runs at 11x or 12x the speed. Not the clock, not the units; the bandwidth to memory and caches is not 11x or 12x as good, and it is not 11x better at branch prediction. I absolutely refuse to accept these results without very substantial backing because they contradict reality as I know it. I know a lot about the P4 and the G4, and I know a lot about programming in a fair number of different languages, even some assembly. I insist that these results do not reflect the actual performance of the processors, until irrefutable proof is presented to show how they do.
I guess the 70 and 90 don't surprise me a lot for the G3/G4, depending on clock speed difference. But all this trendy bandwagon-esque G4-bashing is not correct just because everyone else is doing it. There are things about the G3 that are very nice, but the G4 is no slouch compared to it, and given the higher clock that its pipeline allows, the G3 really can't keep up. The G4 not only sports a better standard FPU, but it also sports better integer units.
Keep in mind this test does not reflect balanced system performance. The point of this exercise has been to determine how the G4's FPU compares to an assortment of different processors and operating systems.
I'd like to know how you qualify "inferior software" on the x86. If the P4 is somehow cheating, then all of the other processors are cheating as well. Again, we ran the exact same code. We even made it into C code on the Mac for maximum speed. In fact I'd like for you to check the code out for yourself, so you can see there is no misdirection here. Keep in mind, other people here have run it on Athlons in Linux and still get sub-10-second times. I've also had a friend of mine (whom I can trust) run it under Yellow Dog on a G4; he got 100+ seconds. And I did not tell him the scores we've been getting on the Mac; I had him run the test first and tell me how long it took before I even said anything. The JRE and now Mac OS X have been factored out of this equation.
When you look at operations like these, for example scalar integer ops, that's all register. The FSB, BSB, or anything else doesn't matter. This is a direct comparison between the two units on the G4 vs everything else. Also, my question to you is, in what way are the integer and FPU units "better" in the G4? I did not build the chip so I can't say whether they are better or not better than those in the 750FX, but I can say I've run a fair benchmark comparing the FPU on the G4 against everything from a P4, Athlon, and C3 to a G3, across different operating systems: Windows and Linux on x86, and Mac OS X and Yellow Dog on the Mac. The results are consistent across the board. What more "proof" do you want?
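For reference, a register-bound scalar benchmark of the kind being discussed looks roughly like this. The actual Java/C code from the thread isn't reproduced here, so the loop body and iteration count are illustrative guesses, not the real test:

```python
# A minimal sketch of a scalar-FPU micro-benchmark of the kind described
# above: a serially dependent chain of float operations, so the work
# can't be vectorised and the hot values never leave registers/locals.
# The loop body and iteration count are illustrative, not the thread's code.
import time

def fpu_loop(n: int) -> float:
    x = 1.000001
    acc = 0.0
    for _ in range(n):
        acc = acc * x + x   # dependent multiply-add
    return acc

start = time.perf_counter()
result = fpu_loop(1_000_000)
elapsed = time.perf_counter() - start
print(f"1M dependent multiply-adds: {elapsed:.3f} s")
```

Because each iteration depends on the previous one, wall-clock time tracks FPU latency rather than memory bandwidth, which is exactly why the front-side bus doesn't matter for a test like this.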
SimD
Apr 12, 11:00 PM
I'm out of this thread.
Avid/Final Cut bashing is useless. Both have their place in the industry. Heck, both are sometimes used together...
Same goes for Adobe. It has its uses.
No need to boast about one being better than the other. Coppola's editor uses Final Cut, the Coen Bros use Final Cut, Fincher's editor uses Final Cut. And a **** load of editors use Avid. Is one film better because of the editing software? Not in my eyes.
Anyway, the update looks promising. I'm excited.
Happy editing. Ciao.
bpaluzzi
Apr 28, 08:48 AM
Those "servers": each server has two Intel Quad-Core Processors running at 50W, 24GB of memory and a 120GB disk drive. Sounds like a nicely packed PC, doesn't it?
It doesn't take a smart person to prune information out to support their claim, while redacting information which doesn't. Why didn't you include the full spec?
"Weta Digital uses HP's BladeSystem c7000 chassis with BL2x220 server modules, with redundant HP Virtual Connect networking modules, full HP redundant thermal logic power supplies and fans, redundant management modules, each server had two Intel L5335 50W processors, 24GB memory and a mixture of 60GB and 120GB hard disk drives."
Most definitely NOT PCs. Sorry, try again.
megfilmworks
Oct 8, 11:02 AM
When pigs fly.
dante@sisna.com
Sep 12, 07:01 PM
Ok, if you're SOOOOO thrilled, you've been living in a cave, because you could've been doing that for years; there's nothing new here aside from an Apple logo on the box... the EyeHome could do that for the last 3 years (no storage, with a remote, streaming from my Mac over WiFi - the EyeHome physically connected to the router, my Mac on WiFi) (http://www.elgato.com/index.php?file=products_eyehome ). And you're right, it's great... Too bad you still have to wait 6 months :P
Yes, except the point is the iTunes/Movie interface, which EyeHome does not have. What is cool is that you can now use BOTH!!!
And the HD capabilities of iTV exceed Eyehome.
G5isAlive
Mar 18, 07:36 AM
What exactly about "unlimited" don't people understand? Without limits.
Actually, there was a limit: single person, no tethering. Anything else is in fact breaking the agreement.
EmilH
Apr 6, 12:01 PM
I switched about a year ago and don't regret anything. Apple would have to screw up big time to make me switch back to Windows :)
citizenzen
Apr 22, 09:38 PM
... if the person has an epiphany, and then reflects on what just occurred logically, it could still be called proof.
Proof sufficient for their own self, or for those they can convince of it.
Insufficient for those who require some form of evidence.
This same argument has been going on for thousands of years. No one has been able to provide tangible, testable proof that God exists.
No one.
balamw
Apr 10, 08:20 PM
I'm not sure what you mean when you say "for the things it is good at." What do you mean? What things?
They've been all over this thread, but you've been focused on the negatives.
Mac hardware:
- Multi-touch gestures. Yes, some PCs have "multi-touch" trackpads, but none are as smooth (literally and operationally) as the one on a MacBook.
- Quietness. Macs generally value quietness, with minimal fan noise and such. This is a plus for anyone who works with audio or needs to concentrate.
- MagSafe. It's a dumb little thing, but I've dumped my laptop plenty of times with the power cord before. It's nice to know I have some protection, and it's saved me many times.
- Unibody construction. You pretty much have to try a unibody machine to feel how different it is from a typical plastic OEM box. Whether it's in your bag, on your lap or on your desk, it feels solid, with no little pieces to break and fall off.
- The glass over the display. While YMMV, it has been great for me with kids who love to poke at the screen. A microfiber cloth brings it back to mint condition.
- The keyboard. I've gotten so used to the darn MBP keyboard that I had to get one for my iMac, and I use an Apple KB on my desktop PC too. (Sad, I know.)
OS X:
- Display PDF is built in. This allows all apps to generate PDFs trivially, WYSIWYG works far better than on Windows, and the Preview.app tool can edit PDFs in ways that require tons of software on a PC.
- Expose. Spaces. Xcode. Each of these has a near equivalent on the PC, but for many of us the advantage goes to OS X's implementation. If you want to develop iOS apps, you should really do that on a Mac.
- Time Machine. Not perfect, but really nice for unattended wireless backup.
- Unix inside. For those of us who are technical at any level, or who also appreciate Linux, it's nice to have a fully functional Unix environment just under the surface.
- iTunes works 100% better under OS X than the Windows port does. For those of us with large libraries, that matters.
- System-wide scripting. Most Mac OS X apps can be scripted using AppleScript or automated using Automator. It's far simpler and more pervasive than under Windows.
The whole package:
- Battery life. Mac laptops running OS X tend to last a whole lot longer than their Windows counterparts.
- Power management just works. (I've had tons of problems with startup, sleep, wake, hibernate, shutdown, etc. in Windows for years, and I see it hasn't improved with my wife's one-year-old Lenovo from work.) I've also had PC notebook batteries that wouldn't even last a year, but I've never had to replace a battery in any of my Macs.
That's just off the top of my head and what is important to me.
If you gave it a chance you might find something that is important to you. If you don't you certainly won't.
B
awmazz
Mar 14, 12:27 PM
This here page, fwiw (http://week.manoramaonline.com/cgi-bin/MMOnline.dll/portal/ep/contentView.do?contentId=8976200&programId=1073754912&pageTypeId=1073754893&contentType=EDITORIAL), says the carrier RR was exposed to thirty days' radiation in an hour. There are more than 700 hours in a month. You do the math.
2 years' exposure a day = 730 years' worth of normal background exposure per annum. That's okay then, not as bad as I first calculated. No breast cancer there. Bring the pregnant women in. I'll drink milk from that cow, eat eggs from them chickens. We all get that flying a plane. Not.
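Spelling out that arithmetic, assuming the linked article's 30-days-per-hour figure and round-the-clock exposure:

```python
# Pure unit conversion; the 30-days-of-background-per-hour rate is the
# premise from the linked article, not a measurement of my own.

rate_days_per_hour = 30
background_days_per_day = rate_days_per_hour * 24            # 720 days per day
background_years_per_day = background_days_per_day / 365     # ~1.97 ("2 years a day")
background_years_per_annum = background_years_per_day * 365  # ~720 years per year

print(f"{background_years_per_day:.2f} years of background per day")
print(f"{background_years_per_annum:.0f} years of background per annum")
```

The round-number version ("2 years a day", so 2 x 365 = 730) slightly overshoots the exact figure of ~720, but the order of magnitude is the point.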
Bill McEnaney
Mar 27, 04:10 PM
It isn't fallacious when the source is known to be unreliable and unrepresentative of the field they purport to be a part of.
But no one here has proved that Nicolosi is an unreliable representative of his field. If someone proves that Nicolosi is mistaken, maybe no one will need to attack him.
During this thread, I've just read an emotionally charged post that doesn't prove anything that the poster says about Nicolosi. I try to feel plenty of empathy. But if others keep attacking someone who disagrees with them, the attackers don't evoke my empathy. They decrease their credibility.
Tilpots
Oct 7, 11:52 AM
Now that Android is coming to Verizon (http://forums.macrumors.com/showthread.php?t=798678) and they will be collaborating on handsets, I have no doubt Android will surpass the iPhone in terms of user numbers. Will it surpass in quality? That remains to be seen...
pdjudd
Oct 8, 09:19 AM
...but who has the market share?
In smartphones? I believe Nokia and RIM are the big ones, and both are vendors with a high degree of control over their software and hardware. On the desktop it's clearly MS, but it's not really accurate to say they got that way by being available on every hardware system under the sun. Microsoft's successes come from building on prior successes. No surprise their biggest success was practically handed to them by a bone-headed decision by IBM.
RIM is proof that you can win tons of market share even while controlling the whole widget to a high degree. The second component is having enough SKUs to accommodate different needs; of course, that can become very unwieldy very quickly.
Google's biggest problem is avoiding the pitfalls Microsoft fell into: trying to have a product that does everything in a market that tends to have difficulty making choices. Either you get it right and maintain it with a focused plan, or you just release a new product every few months and see if it sticks somewhere.
We can't say Google will succeed with this strategy, simply because it's hard to predict how it would happen: there are too many players competing vigorously. We don't have a situation like the desktop market, where an IBM mentality can just hand the market over to Google. Attaching "Google" and "Open" to something doesn't mean it's going to succeed, and even if it does succeed, it could be for a different reason altogether.
If I were a gambling person, I would say ranking isn't the factor to watch, since all the contenders are going to be really close to each other; it's not going to matter that "Google is in second" when they could drop to third in six months. In other words, it's about who can best leverage one success into another, and in a market like this, anybody can do that.
JasperJanssen
Apr 30, 03:41 AM
That's been my observation in the business world as well. With projects often being Web-based now, Windows is becoming irrelevant. On one project with about twenty developers, systems architects and analysts, close to half were running Macbook Pros (no Windows installed) and doing very well. It's just not an issue for many office folks. Obviously there are some roles that still require Windows, but not as many as it used to be. The tech folks in particular seem to take great delight in moving to Macs. Times have changed.
Don't forget the joys of Virtualisation, and especially virtualisation where just the contents of a window from a VM are ported to a window on the host OS.
With MacBook Pros cheaply upgradeable to 8 GB of RAM and quad-core CPUs, there's nothing stopping you from running all three major OSes simultaneously.
amac4me
Jul 12, 08:58 AM
Oh yeah, these babies will fly. Looking to replace my 2004 PowerMac G5 Dual 2.5
Bring it on :D
ChrisA
Sep 26, 01:40 AM
So say I'm using my 8-core Mac Pro for CPU-intensive digital audio recording. Would I be able to assign two cores to the main program, two to virtual processing...
That is not the way it's done. One does not assign threads to cores. What you do is create threads and let the operating system schedule cores to the "ready" threads.
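A minimal Python sketch of that pattern: spawn a thread per task and let the OS scheduler place them on cores. (In CPython the GIL limits CPU-bound parallelism, so a real audio engine would use native threads, but the create-and-let-the-OS-schedule shape is the same; `process_chunk` is a made-up stand-in for per-chunk work.)

```python
import threading

results = {}

def process_chunk(chunk_id):
    # Simulated CPU work for one chunk of audio; the OS scheduler decides
    # which core each ready thread runs on.
    results[chunk_id] = sum(i * chunk_id for i in range(10_000))

# Create one thread per task rather than pinning work to specific cores.
threads = [threading.Thread(target=process_chunk, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # all eight threads ran wherever the scheduler placed them
```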
drsmithy
Sep 26, 09:17 PM
I snipped nothing.
The specific examples I refer to are putting applications in RAM, wherever that RAM might be (a RAM disk in main memory, a RAM-based solid-state drive on the drive bus, or a memory drive on the graphics bus). Some applications greatly benefit from residing in RAM, such as compilers or image manipulators. Photoshop uses a lot of swap space, so you would need large RAM drives to benefit. I am mainly an advocate of RAM drives and see them underused in applications that would clearly benefit. Apple could gain some marketing points by simply offering such an option and then bragging on TV about how a Mac is 20x as fast as a (stock) Dell :)
Rocketman
On modern platforms, the OS will "cache" (in reality it's a bit more complicated, but the effect is the same) the executables and libraries an application needs at runtime and keep them in RAM unless the system is memory-starved. As such, the only thing a RAM drive should speed up on a modern system is initial program load times.
RAM drives are (outside of corner cases like, say, for something like DB rollback logs) a crutch for systems with either insufficient real RAM (in which you should get more and let every aspect of the system benefit) or broken VM systems (in which case you should upgrade your OS and let every application benefit). Many of the methods you might have used to make your Mac II running System 7 faster don't really apply to modern OSes - RAM drives are one of them.
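The page-cache effect described above can be observed with a small, hypothetical sketch: read the same file twice and compare the timings. The second read is typically served from RAM rather than disk (the file path and size here are arbitrary choices for the demo, and the first read may itself already be cached since we just wrote the file).

```python
import os
import time

# Arbitrary scratch file for the demonstration.
path = "/tmp/pagecache_demo.bin"
with open(path, "wb") as f:
    f.write(os.urandom(8 * 1024 * 1024))  # 8 MiB of throwaway data

def timed_read(p):
    """Read the whole file, returning (elapsed seconds, bytes read)."""
    start = time.perf_counter()
    with open(p, "rb") as f:
        data = f.read()
    return time.perf_counter() - start, len(data)

cold, size = timed_read(path)  # may touch the disk (unless already cached)
warm, _ = timed_read(path)     # almost certainly served from the page cache
print(f"cold: {cold:.4f}s  warm: {warm:.4f}s  ({size} bytes)")
os.remove(path)
```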