bugfaceuk
Apr 9, 09:31 AM
Wirelessly posted (Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2_1 like Mac OS X; en-us) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5)
I love how people are comparing an iOS device with a PS3 or Xbox..
Classic Chalk or Pen post.
I understand your point, but I think the fact that they are says a lot about how gaming has changed over the last four years.
awmazz
Mar 13, 11:24 AM
I'm all for nuclear power. It's the cleanest
I guess it depends on your perspective of 'clean'. Yellowcake mining is one of the filthiest, ugliest, long-term polluting human endeavours ever invented. We have three uranium mines:
The Olympic Dam mine owned by BHP Billiton in Roxby Downs here has so far produced over 60 MILLION TONNES of polluting radioactive tailings waste in just 23 years of operation. BHP plans a $5 billion expansion of this single mine. Not more mines, just this one: a whopping $5 billion to expand just one mine. It's very profitable and will become more so as reserves deplete. People in the northern hemisphere are prepared to pay handsomely to shit their energy pollution in other people's yards instead of their own.
And then you have the arsehole owners at the Beverly Mine, going by the name of General Atomics, who insist on using the ever-so-lovely, even filthier acid method known as 'in-situ leaching', basically because they don't give a flying feck. The radioactive particles, heavy metals and the acid used to separate the uranium are simply dumped into an aquifer and leach into our groundwater. No commercial acid-leach mine in the USA has ever been given environmental approval, yet here is an American company insisting on using it here as if our environment is their shareholders' own private toilet and spittoon.
The third mine, owned by Rio Tinto, has just been one environmental or health and safety breach after another, even to their own workers, who were exposed to process water at 400x the maximum Australian safety standard in 2004. Then there were the 2 MILLION LITRES of tailings containing high levels of manganese, uranium and radium which leaked from a pipe. Then there was the contaminated water containing high uranium concentrations released into the Coonjimba and Magela Creeks.
Despite having over one fifth of the world's reserves and the growing profitability of yellowcake to the economy, the Australian govt has limited yellowcake mining to the three existing mines. Because it's just too damn filthy and polluting to open new ones.
Cleanest? Coal mining is much cleaner. You should consider that there's a whole production line of pollution behind getting that 'clean' energy into your home, not just the painted white-for-purity nuclear power plant at the end.
1town
Apr 28, 07:58 AM
Horrible headline.
You do not "slip" upwards.
milo
Apr 13, 11:13 AM
I think that most of them will find that Apple has, at present, abandoned them.
Based on what? An assumption that Color is gone, based on...what?
But for Broadcast TV, it's a real step down in a lot of ways; at the very least it's not a step up. The interface is very iMovie.
Beyond the interface, how specifically is it a step down? What features have been removed?
...especially if they're getting rid of the rest of the FCS apps..
And is there any reason to believe they are getting rid of them, beyond jumping to conclusions?
I AM a full time film editor and I'm very disappointed by the iMovie-esque move. There were a slew of features that REAL editors have been asking for for YEARS (better media management, better multi-user shared projects, and (FOR GOD'S SAKE) better trimming ability). Apple said "nah, f that" and just made iMovie with many of FCP's pro features.
From today's announcement, how do you know none of those new features are in there?
Bring on Logic X for said price and on the App store.
I'd be surprised to see Logic's 40 gigs of download on the app store, but who knows. How big was the last version of FCS?
I very much hope they are coming out with a boxed version with printed manuals. Downloading pro apps or a suite of pro apps from the App Store without physical media or real manuals makes no sense.
Printed manuals? Seriously? What do you do, sit and read manuals on the toilet? Digital manuals are just as "real", and arguably better, since it's easy to do text searches and find what you need more quickly.
What are the chances that Logic X will be released around the same time?
From what I hear, not likely at all. At least if STP is updated along with FCP I hope it's available somehow to Logic users.
RedReplicant
Apr 5, 05:31 PM
One thing that got me was that you cannot make apps fill the screen without dragging and resizing. You can only resize from the bottom right corner. No other real annoyances for me that I can think of.
SizeUp is awesome for this, as well as tiling applications on the screen.
http://irradiatedsoftware.com/sizeup/
munkery
May 3, 12:15 AM
Yes, and that prevents AntiVirus 2010 from successfully collecting credit card info too.
Check out this quote about the latest variant of that Windows malware called Antivirus 2011.
You're blocked from executing anything else, including trying to run your real anti-virus program.
This virus program renders your entire computer useless until you can get it removed. And some of its many variants are becoming immune to existing removal tools.
From here, http://detnews.com/article/20110502/BIZ04/105020317/1013/rss12
BTW, it renders Windows useless by corrupting the registry. No registry in OS X.
Luckily, this type of malware on a Mac is not nearly as bad if you're clumsy enough to get infected. You can even remove it from the infected account without having to boot into a safe mode.
This post made me have to edit a previous post. Thought I should quote it,
Problems with Windows security in comparison to Mac OS X presented just in this thread:
1) Greater number of privilege escalation vulnerabilities:
Here is a list of privilege escalation (UAC bypass) vulnerabilities just related to Stuxnet (win32k.sys) in Windows in 2011:
http://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=win32k.sys+2011
Here is a list of all of the privilege escalation vulnerabilities in Mac OS X in 2011:
http://cve.mitre.org/cgi-bin/cvekey.cgi?keyword=Mac+OS+X+privileges+2011
2) Earlier versions of NT based Windows (Windows XP and earlier) do not use discretionary access controls by default.
3) The permissions system does not include a user-defined unique identifier (password) by default. This makes it more susceptible to user-space exploitation in which authentication is stolen via a spoofed prompt; the prompt appears unrelated to UAC because no password is associated with authentication.
4) Windows sandbox mechanism relies on inherited permissions so that turning off UAC turns off the sandbox. This sandbox has been defeated in the wild (in the last two pwn2owns).
I do not know of any TrustedBSD MAC framework (BSD and Mac sandbox), AppArmor (openSUSE and Ubuntu), or SELinux (Fedora) mandatory access control escapes. These sandbox mechanisms do not rely on inherited permissions.
5) The Windows registry is a single point of failure that can be leveraged by malware.
mcmarks
May 2, 12:19 PM
A couple of points:
- No computer for which the user can write or install programs will ever be free of malware (nor, to my knowledge, has the "malware free" term ever been applied to the Mac OS by anyone actually familiar with computer security). All I have to do is write a script that formats your hard drive, call it ReallyFunGame, thereby deceiving you into downloading it and running it, and poof. Malware at its most basic. (Apple addresses this issue with the App Store reviews for iOS apps, but even there, their review is not sufficient to eliminate all possibility of malware.) So the actual presence of malware is no surprise, nor has it ever been. The defense against these types of attacks is user education and OS design (which will be a compromise between usability and security). Personally, I find the compromises on the Mac less annoying than their counterparts on Windows. Furthermore, the frequent inscrutable dialogs on Windows cause a certain level of desensitization to all dialogs for the least savvy users, undermining their value on Windows because users get used to just clicking through things they don't understand.
- The far more dangerous computer security problem, as has been mentioned in this thread a bit, is viruses (including worms, which are a subset) because they can propagate and cause harm without user knowledge and intervention. This new piece of malware is not one of those (as far as I can tell). To my knowledge, Mac OS X remains a more secure operating system because there are no known viruses that have propagated in the wild that attack it. Now, if the same can be said for Windows 7 (I don't know whether it can or not), then it would be equally secure. Is it?
sinsin07
Apr 9, 03:36 AM
Wait, why are FFII and FFIII more mind-numbing time-killers than any other game (I am getting FFIII either when it goes on a good sale or when I finally finish up my other games, whichever comes first)? Or Myst or Riven for that matter (both on my phone; I've beaten Myst but haven't started Riven).
Eaon
Apr 19, 02:12 PM
Also, Mac networking sucks. PCs rarely show in Finder; sometimes they do, sometimes they don't, and I have to cmd-K far too often. Well, in my experience anyway.
I don't think that's so much the Mac's fault as it is the general design of Windows networking in the Workgroup configuration that Apple continues to have to rely on to talk to Windows systems.
Windows in a workgroup mode uses a method of "broadcast my presence on the network" that you might think is like what Bonjour does for pure Mac networks, but it's of a Windows 95 vintage. Try setting up a pure Windows network using workgroups, not Active Directory, and watch how it can take around 20 minutes for systems to start showing up in each other's network neighbourhoods. It's lame. I know in Vista or 7 Microsoft added a new "homegroup" system, not sure if that's any better.
I guess you could complain that Apple should try to get up to speed on the homegroup thing, but it's not like Microsoft is overly forthcoming with the specs for their networking. If the rumours of Apple ditching Samba for something built in-house are true, maybe that means they've licensed tech from Microsoft to make this work better, but I wouldn't hold my breath.
From my own personal experience, I bring my MBP in to work and plug it in to the AD-based network, and system names start filling up my sidebar faster than I can get the mouse over there to close the Sharing section so I don't have to see them all. :cool:
Gelfin
Apr 24, 03:03 PM
In answer to the OP's question, I have long harbored the suspicion (without any clear idea how to test it) that human beings have evolved their penchant for accepting nonsense. On the face of it, accepting that which does not correspond with reality is a very costly behavior. Animals that believe they need to sacrifice part of their food supply should be that much less likely to survive than those without that belief.
My hunch, however, is that the willingness to play along with certain kinds of nonsense games, including religion and other ritualized activities, is a social bonding mechanism in humans so deeply ingrained that it is difficult for us to step outside ourselves and recognize it for a game. One's willingness to play along with the rituals of a culture signifies that his need to be a part of the community is stronger than his need for rational justification. Consenting to accept a manufactured truth is an act of submission. It generates social cohesion and establishes shibboleths. In a way it is a constant background radiation of codependence and enablement permeating human existence.
If I go way too far out on this particular limb, I actually suspect that the ability to prioritize rational justification over social submission is a more recent development than we realize, and that this development is still competing with the old instincts for social cohesion. Perhaps this is the reason that atheists and skeptics are typically considered more objectionable than those with differing religious or supernatural beliefs. Playing the game under slightly different rules seems less dangerous than refusing to play at all.
Think of the undertones of the intuitive stereotype many people have of skeptics: many people automatically imagine a sort of bristly, unfriendly loner who isn't really happy and is always trying to make other people unhappy too. There is really no factual basis for this caricature, and yet it is almost universal. On this account, when we become adults we do not stop playing games of make-believe. Instead we just start taking our games of make-believe very seriously, and our intuitive sense is that someone who rejects our games is rejecting us. Such a person feels untrustworthy in a way we would find hard to justify.
Religions are hardly the only source of this sort of game. I suspect they are everywhere, often too subtle to notice, but religions are by far the largest, oldest, most obtrusive example.
Funkymonk
Apr 20, 07:18 PM
I like Apple products better, MUCH BETTER. I still don't get how people say Android is laggy and constantly crashes. I've used both the Galaxy S and the Desire HD and I thought they were incredibly smooth and responsive. I like my iPhone better though lol.
totoum
Apr 13, 02:32 AM
Oh but it will sync the sound for you
Right, because wasting time syncing audio manually when you could be doing actual editing is what makes someone a pro.
ohio.emt
May 5, 12:02 PM
I haven't had any dropped calls yet. I think the problem is more the iPhone than AT&T's network. If I drive out of 3G service, my iPhone drops service and says "No Service"; most times it doesn't revert to the EDGE network. I have to turn 3G off, or turn airplane mode off then on, to get service on EDGE. IMHO Apple needs to fix the software so the switch between EDGE and 3G works like on other phones: no drop in service, it just switches over. Sitting at home, if I turn 3G on I get a 3G signal and speed with 4 bars, but after about 5 minutes it switches to EDGE. Any other phone besides the iPhone stays on 3G.
javajedi
Oct 11, 06:30 PM
Originally posted by javajedi
What you are saying makes a lot of sense. Now that I think about it, I too recall reading this somewhere.
Now that we know the real truth about the "better standard FPU", I thought it was time to shed some light on non vectorized G4 integer processing.
It still does 200,000,000 calculations, but this time I'm multiplying ints.
Motorola 7455 G4 @ 800MHz: 9 seconds (Native)
IBM 750FX G3 @ 700MHz: 7 seconds (Native)
Intel P4 @ 2600MHz: 2 seconds (Java)
PowerPC 7455 integer processing is considerably better than its floating point (obviously less work doing ints), but still less per cycle than the Pentium 4.
Very interesting that the G4 loses both floating point and integer to the IBM chip, at a 100MHz clock disadvantage.
I'm still waiting to see that "better standard FPU" in the G4. It seems the G4 is absolutely useless unless you are fortunate to have vectorized (AltiVec) code.
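For reference, the kind of non-vectorized integer loop being described could look like this minimal Java sketch. This is my own reconstruction, not the original test: the class name, loop body, and multiplier constant are all assumptions.

```java
public class IntMulBench {
    // Repeatedly multiply ints to exercise the scalar integer unit.
    static int multiplyLoop(int iterations) {
        int acc = 1;
        for (int i = 0; i < iterations; i++) {
            acc *= 3; // int overflow simply wraps, which is fine for a throughput test
        }
        return acc;
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        int acc = multiplyLoop(200000000);
        long elapsed = System.currentTimeMillis() - start;
        // Printing acc keeps the JIT from discarding the loop as dead code.
        System.out.println("200,000,000 int multiplies: " + elapsed + " ms (acc=" + acc + ")");
    }
}
```

A warm-up run before timing would make the measurement fairer on a modern JIT; the JVMs of that era optimized far less aggressively, so absolute numbers won't match the ones quoted above.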
Alex, yeah, the native version was compiled under 3.1. It really is interesting to note that despite the 750FX's 100MHz clock disadvantage, it is able to outperform the G4 by 22%. Since there is a 13% difference in clock speed, if clocks were equal the 750FX would technically be 25% more efficient in scalar integer. I should also re-emphasize that I never bothered compiling the test natively for x86; I left it in Java, so it's not out of the question that the P4 could do this in 1 second - and that is *NOT* using any vector libraries, just plain old integer math.
I've found some documentation on the AltiVec C programming interface, and this weekend I'm going to make a first attempt at vectorizing it. The integer test should be no problem, but my FPMathTest app that did square roots will be more difficult. AltiVec has no double-precision floating point, which complicates doing square roots. If you want more accurate, full-precision square roots, you have to do Newton-Raphson refinement. In other words, more ************ you have to go through. I believe SSE2 does have double-precision floating point ops, so if you were to vectorize there, you wouldn't have to compensate for this.
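The Newton-Raphson refinement mentioned here can be illustrated in scalar form. This is a sketch with my own names; on AltiVec the initial guess would come from the vector reciprocal-square-root estimate instruction, which the hard-coded seed below merely stands in for.

```java
public class RsqrtRefine {
    // One Newton-Raphson step refining y ~ 1/sqrt(x):
    //   y' = y * (1.5 - 0.5 * x * y * y)
    static float refine(float x, float y) {
        return y * (1.5f - 0.5f * x * y * y);
    }

    public static void main(String[] args) {
        float x = 2.0f;
        float y = 0.7f; // crude seed standing in for the hardware estimate
        for (int i = 0; i < 3; i++) {
            y = refine(x, y); // each step roughly doubles the correct bits
        }
        // sqrt(x) = x * (1/sqrt(x))
        System.out.println("sqrt(" + x + ") ~= " + (x * y));
    }
}
```

Each iteration converges quadratically, so two or three steps take the low-precision hardware estimate to full single precision; the post's point is that without double precision you cannot push beyond that without this extra work.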
Another theory as to why the P4 is scoring so well is that, if I'm not mistaken (and I'm not), the P4's ALU runs at double its clock. So in my case, 5.2GHz. I'm sure this relates to the issue.
I don't know how true this is, but I wouldn't be surprised if there is some truth to it; surely some food for thought:
http://www.osopinion.com/perl/story/17368.html
The G4 was just a hacked-up G3 with AltiVec and an FPU (floating point unit) borrowed from the outdated 604
If this is the case, then no wonder we are getting these abysmal scores, no wonder a 400MHz Celeron can nearly equal it, and no wonder the 750FX can outperform it (different company, different FPU).
iMikeT
Aug 29, 11:10 AM
"Tree-huggers"? "Interfere with business"? We don't want to start that discussion!
Do you have proof for your statement, that Apple is doing their best?
Apple has released a statement regarding the findings and it is just as reliable as Greenpeace's.
Besides, I said that Apple is doing what they can.
ddtlm
Oct 10, 01:10 PM
alex_ant:
Great to see you fighting the good fight!
others:
As true as it is that the G4 is slower than most of its competitors, when it performs as badly as some of the numbers posted here, I can just about assure you the Mac is at a severe software disadvantage. I mean really, look at the specs of a G4: the worst-case performance delta between it and a top-of-the-line PC should be maybe 4x or 5x, not these 10x-and-higher numbers. There are very few situations where a G4 should do less work per clock than a P4.
So let's try to remain realistic here. It is virtually guaranteed that the actual performance potential of a 1.25GHz G4 falls between that of a 1.3GHz P4 and a 2.8GHz P4.
EDIT:
Almost forgot to talk about SPEC. Some time ago, the only SPEC results that I know of for Macs were obtained by c't:
http://www.heise.de/ct/english/02/05/182/
In these they showed the G4 was more or less the same speed as a P3 of equal clock (1.0GHz) in the integer tests, when both were compiled with GCC. Intel's compiler can give the P3 a 30% edge or so, so we know that compiler quality is hurting the G4 here. It is not fair to look at SPEC and declare other chips a zillion times faster than the G4, simply because they are all using very good compilers whereas Apple is stuck with GCC. Apple is working to improve GCC, however, so things may get better.
(In SPEC FP the G4 gets beaten worse, I might add. Compilers played a role for sure, but can't explain the whole loss.)
Multimedia
Oct 27, 12:37 AM
Multimedia, I was wondering if you could address the FSB issue being discussed by a few people here, namely how more and more cores using the same FSB per chip can push only so much data through that 1333MHz pipe, thereby making the FSB act as a bottleneck. Any thoughts?
No thoughts. Hope for the best.
ender land
Apr 23, 09:45 PM
Yes there are. In theistic belief.
You do not think it takes any faith to say that NO God exists? Or that NO supernatural power exists? That you can 100% prove a lack of God?
Google Christian forums (http://www.google.com/search?hl=en&safe=off&qscrl=1&q=christian+forums&aq=0&aqi=g10&aql=&oq=christian+foru).
Then tell them that they're not true believers.
Oh please. If you even bothered to read the descriptions of those sites, you would find the majority of them are faith-based to begin with. There is a huge difference between pointless discussion for the sake of argument and forums dedicated to learning how to better implement one's faith, learn about it, pray for each other, etc.
emotion
Sep 20, 08:36 AM
It looks like a Mini, and I can do exactly the same with the current Mini. Hook up a Mini to a TV and add it to a home network, be it cabled or wireless. With the Front Row software you can now listen to and watch all the content from the other computers on the network via iTunes streaming.
The only differences between a Mini and the iTV are the connections on the back, better wireless speed, and no DVD. It's purely the price and software that make it a media device and not a computer.
I can do what an iPod does with my Powerbook too. Doesn't mean I want to use that to play music when I'm walking around.
Likewise, I want a computer close at hand, hooked up to a computer monitor; it's less than ideal sat under my TV, displaying on a relatively low-res screen with a keyboard and mouse teetering on my lap. I know Apple thinks this too.
With the iTV as I see it you get to have that Mini being a real computer somewhere else in your house.
That said, I could be wrong and it could be a really cut down Mac Mini. I guess we'll see.
ct2k7
Apr 24, 05:29 PM
you say it only applies to muslims yet the victims in blasphemy cases in pakistan, for example, are mostly christians.
If you've been reading, when applied correctly, it only applies to Muslims
The "war" against islam that you speak of is being encouraged by imams, and at saudi funded madrassas in the UK and beyond.
Fundamentalists who have taken an extreme point of view. Are you saying that Islam is not allowed any extremists? All religions have them. But not all Muslims are extremists.
in the US more hate crimes were perpetrated against jews in 2010 than any other group. hate crimes against muslims had gone down in 2010. so, i guess the islamophobia is really poisonous and rampant...
interestingly, as the muslim population increases so too do reported cases of anti-semitic hate crimes.
I could see this coming. We don't all live in the US. Reported rates go down, but it also works psychologically.
If I even dare comment on the last thing, the thread topic will change.
Evangelion
Jul 12, 05:05 AM
Er...have you seen the MacBook Pro pricing? The MacBook pricing? The iMac pricing? The Mini pricing? (Which went UP by a fair amount). If you're thinking that x86 processors are cheaper than PPC, you're sadly mistaken. Cheap computers being cheap has just about nothing whatsoever to do with the CPU....
--Eric
Well, the Mini got more expensive, but its capabilities went WAY up. Optical audio in and out, twice the USB ports (fixing the two biggest complaints about the old Mini), built-in wireless, a roughly twice-as-fast CPU (hell, the new low-end is probably over 50% faster than the old high-end!), and Core Image-compliant video.
Comparing price and capabilities, the Mini just got a whole lot cheaper :). The new low-end Mini costs the same as the old high-end Mini, but it's a lot better.
pdjudd
Oct 7, 04:57 PM
Have you actually READ the link you posted?
Times have changed a bit since then, you know ...
Yes, I have. Several times. Things have changed, but the base premise of the article still applies: Microsoft got lucky. There is no way to suggest that Apple can pull that off in this day and age, when the world depends too much on Microsoft. The article deals with past actions affecting the present; it's very relevant. Its point is that MS became successful because of how it parlayed successes over time, not because it embraced an "open strategy". They did that years ago. Read the whole thing. Gruber makes a point that still applies today, because market share in the OS world has changed very little.
Due to Apple's grown popularity (if not ubiquity) it can be safely assumed that quite a few more people would install Mac OS if it were officially supported on non-Mac hardware. A highly significant number of people? Good question. To Apple's benefit? Probably not.
Popularity is irrelevant. Going up against Microsoft is suicide. Period. Their market share is too large and Apple's success is too dependent on hardware sales. Microsoft's objective is to rule the roost. They did that way back in the early '90s, and they are too well entrenched to be taken out directly. They are just too big. You are simply conjecturing without any basis in reality. Apple tried the cloning market, and it failed because people by and large do not want to undertake the massive pain of moving to a completely different platform without some kind of safety net. People want Windows because the stuff they run depends on it. That, and competing with Microsoft directly is folly - going up against MS is going to be very bloody. You'd have better luck elephant hunting with a pea shooter.
Take a look at any other market that involves hardware and software. The article makes a good point about video games. They are totally incompatible with each other and are very closed systems. They remain successful because they can take one success and transition it to another - like the Mario franchise. MS did the same thing with computers years ago (with the objective of being really lucky thanks to boneheaded decisions by IBM). Apple did not. Of course Apple's objectives were far different back then, but Apple operates differently than MS does.
While Apple could get a few more customers, it just wouldn't last. There is no reason to think that it would or that they could sustain it. Its about making a good choice.
You cannot say that Apple would gain more money by copying MS's business strategy - you just can't, because they aren't the same. You cannot make the flawed assumption that Microsoft achieved success by doing things the way the market was meant to work. They didn't. Microsoft got real lucky, rode on the coattails of the IBM business mentality, and got massive market share because of that - way back in the '80s. That's just how things ended up. It doesn't mean things work that way all the time, and there is no reason to suggest that Apple is gonna want to chance it.
At this point in the game Microsoft has won - Jobs admitted that years ago. Microsoft makes billions from a business market that by and large has no interest in making the risky and expensive change that going Mac entails. Microsoft provides a very predictable, safe route with massive industry support. Apple would have needed this kind of success really early on - but back in the day, they were adopting practices that were fundamentally different.
It doesn't matter that Apple's system is better - the lion's share of the market made their choice years ago, and that market doesn't tolerate direct competition. In Microsoft's world, they are the only game in town. And I say the reason Apple is still around is that they don't encroach on Microsoft's big markets. They don't license their software out to Microsoft's partners; they don't sell office software for PCs. There is a reason - Microsoft is far too big.
alex_ant
Oct 10, 12:04 PM
Originally posted by TheFink
Do you have any pics of your closest attempt at an 8 lb turd?
Yes actually!
brianus
Sep 26, 12:01 PM
Hey here's a question: what comes after Clovertown? The roadmap is kinda confusing after that from what I've seen. When can we reasonably expect Clovertown's successor, and what will it consist of?
I know there's a new architecture two years down the line, a die shrink, and some multicore chips that won't be used in a Mac Pro - but can we expect any real upgrade past Clovertown, beyond mere speed bumps, or will this basically be it until '08?