DeathChill
Apr 21, 07:53 AM
Ouch, it must really have hurt Apple that Android *smartphones* outsold all Apple iOS *devices* worldwide in Q1 (40 million Android smartphones versus 32 million iOS devices). So now they're back to making strange comparisons that cover only *one* market and pit *phones* against *devices*.
Any links for that claim?
Also, Apple doesn't make the charts; I don't get how it's strange to compare a platform to another platform. I think it's stranger to compare a single device to an entire platform.
robbieduncan
Mar 13, 03:50 PM
None of the studies I have read proposing this, have suggested the sort of ecological impact you are implying. This is pure, unadulterated, BS.
Indeed. Some existing solar arrays are built on grazing land that is still productive grazing once the array is in place.
rikers_mailbox
Sep 20, 03:03 AM
If Iger is correct and iTV has a hard drive, then I believe iTV could serve as an external iTunes library server. Authorized computers could access and manage it using iTunes (running as a client). iTunes Store downloads, podcasts, imported CDs, etc. would all be stored on the iTV.
Look at your hard drive usage; music takes up a significant amount of it. Why does it need to be kept on your local machine if iTV provides a network?
Lesser Evets
Apr 28, 07:35 AM
Almost all of that is due to the iPad. They had around 4% of the global market for computers last year.
And growth is bad?
Octobot
Oct 30, 10:46 AM
If I was running the upcoming Leopard OS X, a few OS X apps, the full upcoming CS3 Suite (not necessarily batch processing), had After Effects rendering a 30-minute clip in the background, was downloading *legal* torrents and watching internet TV (muted), all while burning a DVD and listening to music...
Keeping in mind I won't necessarily be rendering multiple scenes while encoding and batch processing with multiple applications while running SETI@home ;) ... yet.
Would that kind of multitasking benefit from multithreading across the Octobot's 8 cores?
Or only slightly / not significantly enough to warrant going octo over quad?
thx in advance,
L
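Whether a mixed workload like that benefits from eight cores mostly comes down to whether the jobs are independent and CPU-bound; the OS will happily spread separate busy processes across all available cores. As a rough illustration only (a Python sketch with arbitrary job sizes, not a render benchmark), here is the difference between running independent CPU-bound jobs serially and in a per-core process pool:

```python
import os
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n: int) -> int:
    """Stand-in for one independent CPU-bound job (a render, an encode...)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def run_serial(jobs):
    """Run every job one after another on a single core."""
    return [busy_work(n) for n in jobs]

def run_parallel(jobs):
    """Run the same jobs in a process pool, one worker per core."""
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        return list(pool.map(busy_work, jobs))

if __name__ == "__main__":
    jobs = [500_000] * 8  # eight independent jobs, like eight busy apps
    t0 = time.perf_counter()
    serial = run_serial(jobs)
    t1 = time.perf_counter()
    parallel = run_parallel(jobs)
    t2 = time.perf_counter()
    assert serial == parallel
    print(f"serial: {t1 - t0:.2f}s  parallel: {t2 - t1:.2f}s "
          f"({os.cpu_count()} cores)")
```

On an 8-core box the parallel run should finish several times faster, but only because the jobs don't wait on each other; a single app that isn't multithreaded won't see the same gain.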
HyperX13
Apr 28, 09:05 PM
Apple is turning its back on the enterprise! But who knows, maybe it's a smart strategy!
Bill McEnaney
Apr 23, 12:20 AM
I don't know what type of atheists you meet, but most of those in this forum (theists too :D) DO argue their beliefs and do not expect them to go unchecked.
Unchecked in what sense of the word "unchecked?"
wolfshades
Apr 15, 09:37 AM
This is an excellent initiative. Bullying goes on beyond high school and college too; you see it everywhere. There are parts of our cities where it's simply unsafe for gay people to walk alone, just because of how their sexuality is perceived by the ignorant and thuggish. I think that's sad - clearly there's still a long road ahead.
Good on Apple employees - and all others who partnered in this initiative - for speaking up.
Maybe the next generation will be the one that shrugs its shoulders when discussion of sexual orientation comes up, like it's no big deal, because no one really sees it as a major social issue anymore. Maybe then the bullying will stop, having lost a target.
AppliedVisual
Oct 21, 12:42 PM
I'm speechless. All I can think of is "Wow!"
Makes 20" 1600 x 1200 look puny and the 24" 1920 x 1200 modest.
Yep. Now that I've gone with the 30", I feel so cramped on anything smaller. The dual 30" config is awesome... More than enough space to leave all kinds of stuff accessible - it's insanely wonderfully cool.
...Which brings up my little learning experience over the past couple of days. I fired up my 30" as the second display on my G5 quad and all was well. But I was starting to have second thoughts about crowding my desk at home, so I packed it back up, took it to the office and plugged it in. It came right up, but I couldn't set the resolution to anything higher than 1280x800. Hmmm... Both machines had the same video card (or so I thought) and were the same system; the one at the office was manufactured 12/05, the one at home 10/05. So I try some software re-installs and whatnot and can't figure it out, so I jump online and research until I'm blue... The 7800GT only has a single dual-link DVI port. Weird, I thought it had two? So I packed the monitor back up and took it home to see what was up... Before plugging it into my quad at home, I started to move the system to open it up and noticed the extra fan opening next to the DVI connectors and the round mini-DIN style connector. WTF! I popped the lid real quick to make sure I wasn't hallucinating. This system has the FX4500 and I never even noticed until now. I guess I never checked. :o I had to dig out my invoice; it was a refurbished system I bought from a local dealer -- a lease return that made it back to them after only 3 months. It supposedly had the 7800GT in it, but nope - FX4500.
Lucky me. :D My resale value on this system just went way up. ;)
How do I look for dead pixels AppliedVisual? Yes I want two. :)
Two kinds of bad pixels usually show up on LCD monitors. Dead pixels are pixels that stay black and won't do anything; they're somewhat rare, really. Stuck pixels are pixels where one of the R, G or B elements is "stuck" at a certain value and won't change. Typically stuck pixels are stuck full-on and will stand out against dark backgrounds. The best way to check for them is to run a full-screen game or program that can show a black background; other color backgrounds can be helpful at times too, since stuck pixels stand out against contrasting backgrounds. Other types of anomalies on these displays are white pixels or sparkles, which can either be static like a dead/stuck pixel or can move or come and go. These are usually caused by a poor video signal or too much power over the video interface; sometimes it can even be a faulty GPU. Multi-component pixels - where more than one R, G or B component is stuck on at the same pixel location - are often a faulty GPU, but sparkles and multi-component pixels can still be a defective display... I ordered a Dell notebook for an employee a couple of years ago and it arrived with hundreds of stuck/multi-component pixels all around the screen edges. Dell swapped it out, but I know it was caused by the system sitting on a loading dock or in a truck overnight when it got to -25F here. The LCD literally froze all around the edges, causing irreparable damage!
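If you don't have a full-screen game handy, the solid-background test described above takes only a few lines of code. This is a minimal sketch using Python's standard tkinter module (the color list and the `--run` flag are my own choices, not any standard tool): it fills the screen with black, white, red, green and blue in turn; any key cycles to the next color, Escape quits.

```python
# Minimal stuck/dead-pixel checker: fills the screen with solid colors.
# Black reveals stuck (lit) sub-pixels; white/red/green/blue reveal dead ones.

TEST_COLORS = ["#000000", "#ffffff", "#ff0000", "#00ff00", "#0000ff"]

def next_color(current: str) -> str:
    """Return the color that follows `current` in the test cycle."""
    i = TEST_COLORS.index(current)
    return TEST_COLORS[(i + 1) % len(TEST_COLORS)]

def main():
    import tkinter as tk  # imported here so the module still loads headless

    root = tk.Tk()
    root.attributes("-fullscreen", True)
    root.configure(bg=TEST_COLORS[0])
    # Any key advances to the next background color; Escape quits.
    root.bind("<Key>", lambda e: root.configure(bg=next_color(root.cget("bg"))))
    root.bind("<Escape>", lambda e: root.destroy())
    root.mainloop()

if __name__ == "__main__":
    import sys
    if "--run" in sys.argv:  # only open the window when explicitly asked
        main()
```

Save it as e.g. pixel_check.py, run `python pixel_check.py --run` on the machine driving the display, and inspect each background closely before cycling on.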
The 30" makes such a huge difference in managing windows of different applications simultaneously. I can see why you wanted two, AV. Tell me, is there a significant improvement in the design of your 3007 vs. the 3005?
AFAIK, there never was a 3005 model, only the 3007. Dell didn't announce their 30" display until last December. I ordered mine on Christmas Eve last year and received it the first week of January. It's a 3007 model as well, Rev.A00. The new one is Rev.A02. Both are identical except I find the old one to have a slight tint to the whites. I had to tweak the color profile for the old one a bit to match the new one, but now it's fine. I don't know if it's a difference in revisions or just normal variation between models or what. The difference is slight, and is only noticeable when the two are side by side, which they are. :D On the bright side, with that Dell forum coupon, my new one was nearly $1K cheaper than the first one.
MrMacMan
Oct 9, 06:55 PM
True that Macs are overpriced, but you do gain the operating system, which kicks Microsoft XP sh*tless. They don't have the apps and other wonderful features.
As for performance, we have lost in most categories, maybe due to companies not writing code for the G4's AltiVec.
For many reasons PCs have taken the lead in market share for a while now.
They have many choices: Dell, Gateway, and tons of other brands, along with the possibility of making your own.
Apple has: Apple for the OS
Apple for many of the apps.
IBM/Motorola for the low-clock-speed processors.
Compared to the PC side:
Microsoft for the OS (mostly; Linux users aside)
Microsoft and many others for apps.
Intel or AMD for nice processors...
We have the disadvantage on many of these factors...
Still, many of us fight on for the better computer, and to fight off the world of monopolies.
Squire
Sep 20, 07:45 AM
To those who say that Apple won't allow this because it would hit their own TV show revenues from the iTunes Store... I disagree. They'll have to give in sooner or later, because EyeTV isn't going to go away. Would iTunes/iPod have been such a success if they'd made us purchase all our music from iTunes, even the stuff we already had on CD?
I'm not going to pay £3 (or whatever) for an episode of Lost if I could have recorded it on EyeTV last night... especially when C4 repeat each episode about 6 times a week anyway.
I see your point but maybe you're not seeing the big picture-- the future as Apple, perhaps, sees it. (And you are paying for that "Lost" episode whether you watch it or not, aren't you?)
A few minutes ago, I was thinking, Gee...if Apple got enough content on iTunes, a guy could just buy all the stuff he wanted to see and to hell with the rest. I see this as replacing cable TV in the not-too-distant future. Customized, commercial-free TV delivered to your computer and then sent to your iTV box. Why pay for that afternoon soap opera that you never watch?
This model probably would not make financial sense for people who watch a lot of TV but, for those who only watch a select few shows, it might be a good alternative to cable TV.
-Squire
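To put rough numbers on that cable-vs-downloads trade-off, here is a sketch with purely illustrative prices ($1.99 per episode and $50/month for cable are assumptions, not quoted rates):

```python
# Break-even point: à-la-carte episode downloads vs. a flat cable bill.
# Both prices are illustrative assumptions, not actual iTunes/cable rates.

EPISODE_PRICE = 1.99    # assumed price per downloaded episode, in dollars
CABLE_MONTHLY = 50.00   # assumed flat monthly cable bill, in dollars

def monthly_cost(episodes: int, price: float = EPISODE_PRICE) -> float:
    """Cost of buying `episodes` downloads in one month."""
    return episodes * price

def breakeven_episodes(price: float = EPISODE_PRICE,
                       cable: float = CABLE_MONTHLY) -> int:
    """Smallest number of episodes per month that costs more than cable."""
    return int(cable // price) + 1

if __name__ == "__main__":
    n = breakeven_episodes()
    print(f"Downloads cost more than cable from {n} episodes/month, "
          f"i.e. roughly {n // 4} weekly shows.")
```

Under these assumed prices a viewer following a handful of shows comes out ahead, while a heavy watcher doesn't, which is the same conclusion as above.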
CalBoy
Apr 23, 05:45 PM
I don't think many people say they're Catholic to fit in or be trendy... Maybe Jewish, but definitely not Catholic.
How do people make atheism "trendy?"
The very notion of making critical thinking subject to blind fanaticism is contradictory.
I've concluded American Atheists who are continually challenged on their beliefs and "surrounded by enemies" are more likely to read into atheism and all it entails, rather like a convert to a religion knows the religion better than people who were born into it. Europe is very secular, compared to the US at least, and thus a lot of people are "born into" atheism/secularism.
Have you spoken to people born into an atheist household? What evidence do you have to back up this claim? It certainly isn't what I've seen, and it runs counter to who atheists (and more specifically atheist parents) are.
Europeans, moreover, consistently out-perform Americans in scientific literacy. Even if Europeans are being born into atheism, it doesn't seem to have negatively affected their knowledge of the relevant facts (quite the contrary, in fact).
You can use pure reason, that's what many of the early church fathers did to try and prove God's existence, via the various famous arguments, and of course later philosophers too. Sometimes the nature of God changes to help him fit into a scheme, like Spinoza's pantheism where he argues God and nature are one and the same, and we exist in God as we exist in nature. For Spinoza God is like a force rather than a sentient being.
I should have put it better: it isn't possible to use pure reason to prove a deity without committing a host of logical fallacies and/or relying on false presumptions.
If you think you can do this, post your argument and let it be put to the test.
A lot of people seem to entertain this notion that theists don't use any sort of logic or reason to ground their faith, but they do. God has to fit a framework (the Judaeo-Christian God, not the God of Islam, which the Qur'an itself says is arbitrary and unknowable because it can do whatever it wants). The problem is that faith is required to take those extra few steps into fully fledged belief, because there can't, at the moment, be any conclusive proof one way or another (although theists are getting more clever and appropriating physical principles, such as entropy and thermodynamics, to try and help them explain God).
It isn't really logic if you're building faith into your reasoning structure. The "framework" is really just one opinion on the matter. I could conceive of a god that uses a different framework entirely, and it would be just as valid as any existing religion's. All religion ultimately boils down to one consistent rule: Trust us.
If someone told us a hundred or so years ago that photons can communicate with one another despite being thousands of miles apart we would call that supernatural, but as time goes on the goal posts are moved ever further.
First of all, photons do not communicate. Humans manipulate them for the purposes of communication. It's no more accurate to say that photons communicate than it is to say that paper does.
Secondly, moving the goal posts is precisely the problem with religion. It's very easy to be "right" if you always mean something different when your prior statement is proved categorically false.
The point really is that after debunking supernatural beliefs for so long, we shouldn't really stand by any one of them without some evidence. God is no different. Without evidence, the idea is just as absurd as believing that killing a young virgin every spring will result in a bountiful harvest. Religion gets a free pass because the indoctrination occurs early, often, and with a very large bankroll.
Hikkadwa
Apr 13, 02:24 AM
Based on the screenshots, this looks like another car-crash bit of software. I bet the guy who destroyed iMovie '06 has something to do with this. Let's just hope I'm wrong.
awmazz
Mar 13, 11:45 AM
This is what I dislike. Not to get all political here, but alternative energy, however nice, is nowhere even close to providing the power we need. Windmills can't ever meet energy demand; we're talking about filling maybe 5% of it even if we put them everywhere. They're also too costly at this point for their given power output. Solar energy, though promising, still has piss-poor efficiency, and thus isn't ready for prime usage for some time. There are really no other alternatives.
And this is what I dislike about the pro-nuclear rhetoric. This is not true at all. Geothermal energy: cleaner, cheaper, safer than nuclear by orders of magnitude.
A nuclear power station is just a steam turbine using poisonous rocks instead of carbonized trees as a heat source. I believe the iPad app version of Popular Science has an illustrated article about a test plant using geothermal heat to run steam turbines instead.
Lord Blackadder
Mar 15, 07:29 PM
Nuclear power didn't have a long-term future in Germany before this event anyway; the discussion is only about the running time of existing nuclear plants (after all, 6 reactors were originally destined to be shut down in the 2010-2013 time frame).
The politicking here will be that after the elections the reactors will be turned _on_ again... against the will of the voting population.
I don't know much about the situation, but it seems to me that if the reactors are already up and running, the majority of the environmental impact has already happened. Shutting them off now versus when the currently installed fuel rods are spent does not significantly reduce the environmental impact of the stations - all it does is take 7 GW out of the grid, energy that will presumably have to be made up through increased output from coal/gas/oil plants. However, as you said:
the question which comes up, though, is: if 7 nuclear plants can easily be taken off the grid for 3 months without consequences to the electricity supply... why exactly are they deemed so important?
If they really can afford to take them off the grid, then why are they running? Perhaps they are selling the energy to other countries and don't want to lose the revenue? Or maybe the German government is unwilling to remove a domestic power-producing option in favor of fuels they have to import from elsewhere?
An interesting situation.
A cold comfort considering it is now already thought to be close to a level 6 incident on the INES scale.
Yes, but you'd be saying the same thing regardless of where this incident fell on the INES scale, wouldn't you? As long as global energy consumption continues to grow, you'd better get used to living with nuclear power, because right now there just isn't an alternative. Turning off all the nuclear plants will put much heavier pressure on the oil, coal and gas industries, and that will have its own set of consequences.
miniConvert
Oct 7, 06:21 PM
Android should easily surpass the iPhone in market share, IMHO. So what?
It's an OS written to run on a multitude of hardware and is/will be heavily customised by both manufacturers and operators. Due to this I doubt it'll ever match the iPhone for quality, while in terms of market share it should clean up.
MacBoobsPro
Oct 26, 10:36 AM
16 cores in 2007
32 cores in 2008
64 cores in 2009
128 cores in 2010
You want to wait 'til 2010 at the soonest? :rolleyes:
4 years. Can't wait. My emailing exploits will just zip along.
How many chips would it span though?
Cyrax
Apr 6, 01:32 PM
What if I just want my top 10 favorites? In Windows I just drag the icon (of whatever I want) to the Start button, then drop it into the list of my favorites (I'm not sure of the actual term for this). Can this be done on a Mac?
Since I open the same 10 or 12 programs or folders or files many times throughout the day, every day, this is pretty important to me. It would absolutely mess up my work flow to lose this feature.
Those programs are the ones you would put on your Dock.
puma1552
Mar 12, 01:28 AM
Guys,
Please stop speculating about the situation of the Japanese nuclear reactors, protocols, and regulations, or how they--those specific ones--work.
Unless you are an expert with a background in chemical/nuclear engineering, and an expert not only on nuclear reactors but also on Japanese nuclear regulations, then you aren't really in a place to criticize from halfway around the world. We derive 30% of our power from nuclear reactors; we know what we are doing. We aren't unnecessarily paranoid about nuclear power like the West is.
We know very little about the situation with the Japanese reactors, and even less about the reactors themselves.
Comparing them to the 30+ year old standards of the impoverished USSR is rather inappropriate.
i_am_a_cow
Mar 20, 01:53 PM
Yes.
Probably not, but are you going to whip out a check to pay for it? Software development is not free.
What a silly thought. Of course it's not free. I'm saying that it is just as unethical for Apple to ignore Linux as it is for DVD Jon to try to play music on Linux. We are not talking about what is technically wrong here. After all, every country has a different set of laws. We are talking about what is the right thing to do. It would hardly be a burden for Apple to port iTunes and open up AirPort drivers.
My main concern is Apple's stubborn refusal to adopt simple standards. They haven't kept up with GNU standards in GCC, they won't port QuickTime or iTunes to Linux, and they won't make open drivers available for AirPort cards. Apple is losing quite a few fans. I was a huge Apple fan for a long time (3/4 of my life). Now I am losing respect for Apple's ridiculous money-making stubbornness.
And don't try to argue that Mac OS X is just the same as Linux. It isn't.
Phil A.
Aug 29, 02:51 PM
The one thing that struck me in the report is the amount of marks given to companies that have committed to a timescale. For example, Apple have committed to removing all BFRs but given no timescale and are marked as "bad". Dell have committed to removing all BFRs by 2009 and are marked "good". Don't get me wrong, it's good that companies are giving timescales, but they don't really mean jack until they're implemented (the UK committed to the Kyoto Protocol and will miss its commitments by miles), and I think it's a bit misleading to give any company full marks simply because they have given a date that may be missed. I would have preferred to see those marked as "partially good", because clearly a commitment isn't as good as actually delivering on promises.
THX1139
Jul 13, 02:40 AM
If you don't need all the power you can get, the Mac Pro is not for you. Apple does not do a consumer tower and most likely never will; they simply must have a quad setup, and if they have two configs of it (a 3GHz and a 2.66GHz) they may as well keep the low-end option on the same platform. This has been said again and again and again: Conroe is not bad, it just does not make sense for Apple to use it in the Mac Pro. Conroe goes in the iMac.
I wasn't saying that I don't need power; I just don't want to pay a premium for quad processing with expensive, overrated chips. And just because I don't want a quad doesn't mean I should be stuck with an iMac. I would be content with a Conroe running around 3GHz in the currently shipping configurations. From your post, I get that you think the Conroe is for prosumer/home computers and the only "professional" level chip is Woodcrest. Apple has been shipping a mid-range dual 2.3GHz G5 for quite a while now. What's wrong with them shipping something similar with Conroe? Oh, wait... that would be wrong, because by your account, Conroe is NOT a professional chip. I disagree.
mac jones
Mar 12, 03:58 AM
Hey, I've been hanging out on the forum for the iPad. But frankly I'm a little confused right now about what I just saw. From appearances (I mean appearances), the nuke plant in Japan BLEW UP, and they are lying about it if they say it's a minor issue. I don't want to believe this. You can see it with your own eyes, but I'm not sure exactly what I'm seeing. Certainly it isn't a small explosion.
Until I know what's really happening I'm officially, totally freaked out... Any takers? :D
DrDomVonDoom
Apr 13, 01:29 PM
So basically what you are saying is that you are a two-bit hack and a kid with just an ounce of creativity can easily replace you, because any kid can afford a $300 program, whereas a $900 one keeps them artificially out of the game.
The really ironic thing about your post is that FCP 1.0 was a cost revolution itself, bringing video editing to the masses for really the first time ever, which you took advantage of. Now that Apple is doing it again and you are at risk, you are seemingly outraged.
Try to get your facts right before spouting off; obviously you are no pro app user. Premiere was before FCP, and FCP was taken from Premiere, as the person who built FCP was the same. Premiere was the first cost revolution, not FCP 1.0, as Macs didn't sell many at that point. It stands to reason that if you dilute something in price it will then be worth less, and in business you need a premium product to keep your head above water. It's all very well Apple releasing GarageBand, as this is meant for kids and individuals to play around with, and when or if they decide to pursue this as a career they can be upsold to Logic or Pro Tools etc. This is a huge step up for that route, but what I am saying is this: if everyone has the same tools, then how can it be called a pro app? The new FCP is pretty much based on iMovie, and for those who don't accept that, try using them both together and then you will see.
Take the RED camera... this could sell for $5k and everyone would have one, so why would you pay a daily rate of $1,500 to have someone use a camera that only costs $5k? Wake up and smell the coffee, but as your post indicates, you don't live in the real world: companies will pay more for something they feel is better than it really is. It's simple business logic and psychology. Companies pay a premium for a professional using professional gear, not an app you download from the App Store.
I think that the 'professional' world you're living in is starting to change. Applications aren't just the forte of a few high-and-mighty code monkeys. For example, I could go get Xcode off the App Store and download it for 5 bucks; that's all you need to make a killer iPhone app, 5 bucks. Angry Birds made millions of dollars, and it started with 5 bucks. It could be used by a Fortune 500 company to create an in-shop app that can do much for the company, or it can be used by some kid in his room to create a game. This idea that there is a special elite out there is changing. Technology is embraced by everyone, and everyone born today will have the same opportunities to use it. Computers and video editing aren't just something done by geeks in a basement on some college campus using machines the size of desks. It's done by grannies, kiddos, everyone. High-definition cameras are affordable to anyone with a little skill in saving. People aren't going to need 'professionals' forever. Why hire a photographer for a wedding when I can afford just as good a camera and photo editing software for less than it would cost to hire them?
We can't keep professionals around just for the sake of keeping them around. If they are productive, if society needs them, then they will do fine. I'm sure your industry needs you, and plenty of regular Joes do too. But not forever, definitely not with the next generation of script kiddies and technology-savvy teens.