Queso
Mar 26, 03:22 PM
He's rich, yet he wears the same thing every day?
When he wants to be "accidentally" photographed most definitely.
iHarrison
Jan 6, 03:16 PM
Someone explain this to me... wouldn't activating push notifications absolutely KILL my iPhone 3GS battery life? I've never been able to figure out whether push notifications are bad to turn on because they drain the battery.
Thanks!
menziep
Sep 25, 09:57 AM
If you are capable of understanding German:
www.mactechnews.de is reporting "live";) :)
Says there is iLife integration and plug-ins
rasmasyean
May 4, 10:56 AM
I don't know. Does the US military usually sell its tech to the Japanese?
Seems to me that it's a technology lots of people are working on in parallel.
Nice example. Frank Whittle (http://inventors.about.com/library/inventors/bljetengine.htm) received the first jet engine patent in 1930. He had been in the Air Force, but they wouldn't sponsor his research - so the development was privately funded and finally demonstrated in 1937.
I think you're confusing fission and fusion.
Darpanet, indeed. But the web itself was developed in peacetime by a man researching at a (non military) Swiss research establishment (http://public.web.cern.ch/public/en/about/web-en.html).
The first commercial transistors were developed for telecoms by AT&T / Texas Instruments (http://en.wikipedia.org/wiki/Transistor).
The integrated circuit was invented in peacetime, and its mass production was spurred as much by the Apollo program (http://en.wikipedia.org/wiki/Integrated_circuit) as by defence.
Interestingly, defence and space are very conservative in their use of technology and CPUs. The increase in CPU power over time has clearly been motivated by commercial market forces (non military).
Yes, I don't deny that defence money does finance innovation. But that's not the same as implying that innovation wouldn't take place if it wasn't for War. That's clearly nonsense - there's plenty of civil and commercial market forces that also spur development, and the examples you've cited demonstrate a few. War is not an essential for human or technological development, although it may speed it along a little from time to time.
I don't think you understand how technological advancement progresses. You seem to have this idea that once something is thought of in bed, it's guaranteed to be on an instant beeline to world-scale distribution. While it's true that many tech breakthroughs (or ideas) can be implemented right away, many of the most disruptive ones require huge investments with no obvious guarantee of a profit.
And there is a distinction between nuclear reality and nuclear fantasy (fusion).
http://www.howstuffworks.com/nuclear-power.htm
Bollocks. It is absolutely nothing to do with evolution. Opposed thumbs, brain size, bipedality, toolmaking and speech have had the most influence on our development. As to whether we have evolved past any other species, that, I would have thought, is very much up for debate.
Yea it does. To put it simply, there's no animal in between "us" and the "nearest monkey". They are all fossils. That's because, in competition, we killed "our own kind" in the struggle for survival and prosperity. That is... unless you prefer the "man created in the image of some deity" explanation.
cvaldes
Oct 6, 05:52 PM
It's a Shaw Wu rumor, so it must be poppycock.
Sorry, folks. Nothing here to see. Move along.
:D
robanga
Mar 25, 09:24 AM
If there are any questions from children on what careers they should gravitate toward,
Intellectual property attorney should be high on the list. It's replaced sales as a way for companies to gain revenue. Just take the results of other companies' sales. No matter that they out-executed you on whatever idea you claim.
They all do it now.
baryon
Jun 18, 01:57 PM
Woah... I would partition a 2TB SDXC card and use 1TB for Time Machine and the other half for Final Cut Pro! On freaking 3 square centimeters!
redscull
Apr 6, 01:27 PM
This is not what you originally said. You said "unless every normal person has a tech friend/relative to keep the tablet working/updated" which is something else altogether, about personal ability. Why else would they have to be a "tech friend". If it was only about having one "period" then any dumbo friend/relative with a computer would do.

My argument wasn't changed, but it might have been misunderstood. If every normal person was using the tablet, it would only be their tech friends who had a true computer in addition to a tablet (my implication being that only techies maintain multiple personal computers). So all the normal tablet users would need tech friends for the occasional sync. Sure, any dumb friend with a reasonably modern computer would do, but the idea is that all these dumb friends have moved on to tablets as well. At some point, everyone has tablets, and only the techies have a second, "real" computer.
You can get apps without a computer. You can get music without a computer. You can get TV shows and movies without a computer. You can get mail without a computer. You don't actually need to sync anything. If you do not have anything to transfer over anyway (your "it's going to be the only one he's using" scenario), then you don't need a desktop.

I'm wondering if you've ever tried a "never synced" iOS device. It actually is limited in some fairly annoying ways. For "normal" people, one of the biggest is the inability to manage photos. You can import them with the photo kit, but you end up with one stupid huge mess of pictures until you use a real computer to rearrange everything. You also miss out on updates. And maybe you think that's not important, but it becomes a big deal when all the apps you want to download start requiring a newer iOS version than what you have. It might take a year, but you'll reach a point where not syncing ends up meaning you can't get apps either. There's also a different mindset around backups for a portable device vs. a desktop. Justified, too; it's a lot easier to lose or wipe out the data on a tablet than on a desktop computer. It needs Wi-Fi or even cloud-based backup without having to first sync to a computer.
jayducharme
Apr 21, 01:29 PM
I'm hoping that what this will mean is that the new iPhone will be able to do everything the iPad 2 can now do -- especially screen mirroring.
pubwvj
Apr 7, 05:56 PM
Never strive to be normal.
Still, although I enjoy using my PowerBook and iPod touch, I would love to have an iPad. Nice bit of engineering.
nixd2001
Sep 14, 07:48 PM
Originally posted by onemoof
Someone asked the difference between RISC and CISC.
First thing, there isn't that distinction anymore. RISC originally meant that the processor had fixed-width instructions (so it wouldn't have to waste time asking the software how big the next instruction will be). CISC meant that the processor had variable-width instructions (meaning time would have to be taken to figure out how long the next instruction is before fetching it.) However, Intel has addressed this problem by making it possible for the processor to switch to a fixed-width mode for special processor-intensive purposes. The PowerPC is stuck with fixed-width and has no ability to enjoy the flexibility of variable-width instructions for non-processor-intensive tasks. This means that CISC is now better than RISC. (Using the terms to loosely define Pentium as CISC and PowerPC as RISC.)
Originally it was Reduced versus Complex instruction set computer. Making simpler processors go faster is generally easier than making complex processors go faster as there is less internal state/logic to synchronise and keep track of. For any given fabrication technology, this still generally holds true. Intel managed to sidestep this principle by investing massive sums in their fab plants, effectively meaning that the fab processes being compared weren't the same.
The opposite end of the spectrum from RISC is arguably the VAX line. With this instruction set, massive complexities arose from the fact that a single instruction took so long and did so much. It was possible for timers, interrupts and "page faults" to occur midway through an instruction. This required saving a lot of internal state so that it could later be restored. There were examples of performing a given operation either with a single instruction or with a sequence of instructions that had the same effect, but where the sequence achieved the job quicker, because the processor's internal implementation was being asked a simpler task and could get on with it faster.
The idea of fixed-size instructions isn't directly coupled to the original notion of RISC, although it is only one step behind. One of the basic ideas with the original RISC processors was that an instruction should only take a single cycle to complete. So a 100MHz CPU might actually achieve 100M instructions per second. (This was often not achieved due to memory latencies, but this isn't the "fault" of the processor core.) In this context, having variable-length instructions means that it is easy for the instruction decoding (especially if an instruction requires more than one "word") to require more effort than any other aspect of executing an instruction.
There are situations where a variable-width instruction might have advantages, but the argument goes that breaking the overall task down into equal-sized instructions means that fetching (including caching, branch predicting, etc.) and decoding these instructions becomes simpler, permitting optimisations and speed gains to be made elsewhere in the processor design.
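To make the fetch/decode contrast above concrete, here is a toy sketch in Python. The opcodes, widths, and length table are invented purely for illustration (real instruction encodings are far more involved); the point is that the fixed-width decoder knows every instruction boundary up front, while the variable-width decoder must examine each instruction before it can even locate the next one.

```python
FIXED_WIDTH = 4  # bytes per instruction, as in classic RISC designs

def decode_fixed(stream):
    # Fixed width: every instruction's offset is known in advance, so the
    # boundaries can all be computed (even in parallel) without decoding.
    return [stream[i:i + FIXED_WIDTH] for i in range(0, len(stream), FIXED_WIDTH)]

# Variable width: an invented opcode -> length table. The decoder must read
# each instruction's first byte before it can find the next instruction.
OPCODE_LENGTH = {0x01: 1, 0x02: 2, 0x03: 4}

def decode_variable(stream):
    out, i = [], 0
    while i < len(stream):
        n = OPCODE_LENGTH[stream[i]]  # serial dependency: length before next fetch
        out.append(stream[i:i + n])
        i += n
    return out
```

The serial dependency in `decode_variable` is the cost the RISC designers were avoiding: boundary-finding sits on the critical path of every fetch.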
Intel blur RISC and CISC into gray by effectively executing RISC instructions internally, even though they support the apparent decoding of CISC instructions. They only do this for legacy reasons.
Apple will never switch to IA32 (Pentium) because 32 bit processors are a dead-end and maybe have a couple years left. The reason is because they can only have a maximum of 4 GB of RAM [ (2^32)/(1 Billion) = 4.29 GB ]. This limit is very close to being reached in current desktop computers. Apple MAY at some point decide to jump to IA64 in my opinion, and I think they should. Obviously the Intel family of processors is unbeatable unless they have some sort of catastrophe happen to them. If Apple jumped on they'd be back on track. Unfortunately I don't believe IA64 is yet cheap enough for desktops.
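As an aside, the 4 GB figure in the quote above is simply the size of a 32-bit address space, and the odd-looking 4.29 comes from dividing by a decimal billion rather than by 2^30. A quick check:

```python
# A 32-bit pointer can address 2^32 distinct bytes.
addressable_bytes = 2 ** 32
print(addressable_bytes)            # 4294967296 bytes
print(addressable_bytes / 10 ** 9)  # ~4.29 decimal gigabytes, as quoted
print(addressable_bytes / 2 ** 30)  # 4.0 binary gigabytes (GiB)
```

So "4 GB" and "4.29 GB" describe the same limit in binary and decimal units.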
I think this "unbeatable" assertion requires some qualification. It may be that Intel will achieve the best price/performance ratio within a suitable range of qualifications, but this is different from always achieving the best p/p ratio whatever. Indeed, IA64 versus Power4 is going to be an interesting battle because Intel has bet on ILP (instruction-level parallelism) whereas IBM has bet on data bandwidth. Ultimately (and today!), I think IBM's bet has more going for it. But that's if you want ultimate performance. The PC space is often characterised by people apparently wanting ultimate performance but actually always massively qualifying it with severe price restrictions (such as less than 5 digits to the price).
dcv
Oct 26, 12:14 PM
so who's got a tshirt then?
ehoui
Apr 27, 05:56 PM
Would you want Donald Trump as our president? I am really on the fence for this one. This is the first time I am allowed to vote, so I am paying attention to politics more than ever. I mean, it would be good to have a completely loaded president so he "can" spend some of his money on the economy, which Trump says he will do. We all know that all politicians are liars, though. Trump has changed his mind about a lot of things, such as abortion, socialism, and Obama's health care plan. A little while back Trump supported the three things I just mentioned but has recently changed his mind. This makes me think he would be unstable when it comes to decisions as president. Also, I don't take him seriously. So what do you think about Trump for president?
It should not matter what "I think" about Trump as it relates to your vote. But, I think you are trying to get a deeper sense of the candidate by asking others, which seems both reasonable yet misguided to me. My suggestion is that, at the end of the day, vote your conscience based on what HE SAYS and DOES and not what others interpret. You have a moral compass, use it.
rovex
Apr 27, 07:19 AM
He's better than the McCain would have been in most things, but on a whole he gives in to the Republicans way too much. He passed their healthcare plan instead of one that would actually work, kept Guantanamo open, and as far as "National Security" goes he's about the same as Bush. So basically, he was the best of the two choices, but still not very good.
"yes he can" lie.
Obama is a joke. False hope and the naive people didn't think so.
Trump's hair is seriously the mojo. Love it.
LagunaSol
Apr 12, 04:15 PM
Pfft, this whole "iPad fad" is going to fade away now any day now. Right guys? ;)
bizzle
Apr 11, 06:14 PM
I paid $3.50 today 87 in NJ. I fill up every three days for work. I get around 27-28 mpg in my 05 Civic Si.
skunk
Apr 14, 04:24 PM
Something I am seeing more and more which is downright terrifying/befuddling to me is the notion that not giving someone something is the same as taking something from them. Example: Tax cuts. I hear time and time again that tax cuts "cost" the government money. Excuse me?

I don't really understand your confusion here: the government is essentially selling the taxpayer a bundle of services. If they lower the price, it costs them money. Surely that isn't so complicated?
redeye be
May 25, 11:42 AM
In that case, bring it on, I eat punks like you for breakfast! :D
Maybe this should be a new feature for the folding widget: to see when you will be overtaken by someone, or when you will overtake someone.
Actually, I should be able to do that; you would have to choose your targets yourself, though. I won't be able to provide you with the closest threats and/or overtakes, but if you know who you want to track, it's not that hard to show/calculate.
I'll first clean up the code and add detailed stats, then I'll redesign the layout and incorporate the threats/overtakes (this might take a while - busy period @ work).
Now if someone would post this widget in this (http://forums.macrumors.com/showthread.php?t=128541) thread I'd be a happy man. I would do it myself, in fact I almost did...
Come to think of it, asking is the same as posting it myself but... bah.. lol prrt <over and out> sry
;)
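For what it's worth, the overtake calculation described above is just linear extrapolation from current point totals and daily production rates. A minimal sketch (the function name and the example numbers are hypothetical; a real widget would pull the totals and rates from the team stats):

```python
def days_until_overtake(my_points, my_rate, their_points, their_rate):
    """Days until a rival's total passes mine, assuming constant daily rates.

    Returns 0.0 if they are already ahead, None if they are not gaining.
    """
    if their_points >= my_points:
        return 0.0  # already overtaken
    if their_rate <= my_rate:
        return None  # the gap never closes at these rates
    return (my_points - their_points) / (their_rate - my_rate)

# e.g. I have 10,000 points producing 500/day; a rival has 7,000 at 800/day
print(days_until_overtake(10_000, 500, 7_000, 800))  # 10.0
```

In practice rates drift day to day, so the widget would presumably recompute this from a recent rate average each time it refreshes.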
GeekLawyer
Apr 21, 12:51 PM
I suspect the next iPhone, released in June, July, or September will be largely unchanged from the 4. An A5, sure. Maybe higher storage capacities. A "world" model, from what the Verizon exec said. Black or white. That's about it.
hagjohn
Apr 20, 01:18 PM
Is it me or is Apple becoming a silly caricature of its own 1984 ad?
It's not just you.
brett_x
Sep 20, 08:41 AM
nothing for the powerbook g4s?
What about my SE/30? Nothing???
plinx0r
Jan 6, 04:00 PM
has anyone tried syncing the contacts yet? i'm curious what information gets pulled down and tied to a contact besides the profile picture and "links."
zin
Apr 28, 05:26 PM
Amazing, and then what? Maybe use it twice in your machine's life?
Thunderbolt enables faster read rates than a DVD. That's why.
BC2009
Mar 25, 11:00 AM
Before all you Apple fannies disagree with this; just remember Apple is trying to sue everyone else too.
It's all ridiculous.
"Apple fannies" -- I like that.
Seriously, the amusing part of this is that patent trolls are usually companies that never produce anything based on those patents. Sadly, Kodak is a company that once produced decent stuff, but is now essentially acting like a patent troll because they don't really produce squat anymore. Considering that Kodak is busy liquidating entire manufacturing sites, it would be amazing if this company ever made a comeback (even if they won $1B from Apple).