Everything posted by Rilbur
-
While the conversation on AIDS is probably useful (yes, it's still happening; yes, it's deadly; yes, you're an idiot if you ignore it), we're starting to drift a bit. To touch back on the original topic (or at least the topic title...): does being gay mean being promiscuous? No, not automatically. But it's an assumption that's actually halfway logical. To be blunt, men have higher sex drives than women (especially in the 15-25 age bracket), and a much lower reluctance (cost of entry, whatever) than women do -- no pregnancy to worry about. Men are more interested in sex, men are more driven to seek it out, and men are less likely to shy away from it for various reasons. On top of which, while women in general have an emotionally focused sex drive (relationship to partner is what's important), men are much more strongly signaled by physical cues -- which is to say, a woman wants to please her man, while a guy gets going because he sees something 'fun'. It's no surprise that gay men tend to be more promiscuous -- tend to have more sex -- because their partner is going to be just as interested in 'getting it on' as they are, rather than being the restraining force. Nothing immoral about it, just applied biology and sociology.
-
Actually, it doesn't necessarily mean an underlying software issue, it just means they have a rapid patching cycle -- they tend to release their fixes quicker, rather than let them sit on their drive. Chrome does the same thing, it's just much, much more circumspect about it.
-
My biggest concern with that is that rapid update cycles are actually fairly important, especially for security. There may not be much, if any, time between when researchers discover a security flaw, and it's discovered 'in the wild' -- used against YOUR machine, as likely as not. Edit: And I use Microsoft Security Essentials for my AV; works great.
-
It's not just a matter of glare. LCD-based screens tend to look washed out in strong (read: sun) light. To a degree, iPads and iPhones (and some other devices) compensate by automatically adjusting screen brightness based on ambient illumination. Unfortunately, they can only do so much. The backlights included in the devices can only push so much light themselves, and the more light they push, the more power they drain. That said, I love my iPad.
-
Honestly, all the best games are on the PC. Antichamber, Braid, Space Pirates And Zombies, FTL, World of Goo... I'd continue, but then I remembered that Fez was an Xbox exclusive, and XCOM was on multiple platforms.
-
Now now now James, repressing all that anger isn't healthy. Just let it alllll out. That said, don't forget that some of those problems have more to do with the technology of the day. Calling it a 'POS' is only true by modern standards.
-
'Too soon'? That's a joke, right? It's been over seven years! Technology has moved on and left the PS3 far behind! Furthermore, with the new Wii U out, and the new Xbox system coming soon, they more or less have to release in this timeframe. (I'm talking in terms of years, here, not days or months.)
-
This I did not know. That said, the basic point stands -- if they'd been running at a safer speed, rather than rushing to get there 'on time', they probably would have been able to turn quickly enough.
-
The Titanic was a disaster because of arrogance, not because of bad design. If they hadn't run the ship so fast they couldn't avoid icebergs... if they'd had enough lifeboats for all the passengers... Hell, if they hadn't panicked they'd have gotten many more out alive, for crying out loud! The Titanic's sister ships didn't sink thanks to bad design: one lived out its lifespan and was retired, the other was sunk during World War I by hostile action.
-
It's not the transferring of information between cores that's the issue -- we've got that licked. (Mutexes, semaphores, and test-and-sets, oh my!) It's a bit of a pain, programmatically speaking, but the tools are there if used correctly. They've existed for years, because most of them were in use before multiple cores existed! (They came into use back when pre-emptive multitasking became possible, allowing the operating system to avoid 'surrendering control' of the system hardware to guest programs.)

The issue is that humans suck -- and suck hard! -- at doing the programming work involved. And our egg-heads are still working on ways for machines to take our code and multi-thread it for us at the compiler level. It's difficult to even explain the issues involved with multi-threading without a blackboard and a couple of hours to work things through. Solving those issues? Fun-fun-fun! Not because it's impossible, but because it's easy to get wrong -- and almost humanly impossible to get right. And that's after you figure out how to break things out into multiple threads.

Let's try this a different way... I'm willing to bet that when you work on something you do often, you have a reasonably defined pattern of how you go about it. Step one, step two, step three, step four, and so on. Some tasks you could easily break up amongst multiple people: have one person take this set of steps while someone else does some others. But what happens when each step is dependent on the step before it? That's the issue that's preventing full use of multiple-core machines -- though we're getting some great results simply from being able to run the OS on one core, background programs on another, and then having two spare cores for whatever the user is actively doing.
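To make the first point concrete -- the tools exist and work fine when used correctly -- here's a minimal, hypothetical Python sketch (not from any particular program) of a mutex guarding a shared counter across four threads. Without the lock, the read-increment-write on the counter could interleave and lose updates; with it, the result is deterministic:

```python
import threading

counter = 0
lock = threading.Lock()

def add(n):
    global counter
    for _ in range(n):
        with lock:  # the mutex makes the read-modify-write atomic
            counter += 1

threads = [threading.Thread(target=add, args=(10000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
# counter is now exactly 40000
```

The hard part, as above, isn't the lock itself -- it's deciding what needs locking, in what order, across a whole program.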
-
The reason GPUs won is because graphics (and eventually physics) are what are referred to as 'infinitely parallelizable' tasks. It's very easy to break graphics and physics calculations down into separate tasks that can be performed independently -- something GPUs are explicitly designed for. Most other aspects of games are too heavily built around being done in serial, step one then step two. That's one of the big reasons CPU power has started to stagnate (and become less relevant) in recent years. Processors have reached a wall of sorts in increasing their raw speed, even as we've started being able to build CPUs with multiple cores. The issue is almost all our programming is built around a single-core approach; programmers are still trying to come up with ways to work across multiple cores -- something that is VERY difficult, and horrendously error-prone. As far as I know, the best anyone has done with that is splitting out separate components of a game across multiple threads -- AI, music, etc. each getting their own thread. There's a limit to how far you can do that before each component is independently threaded, and you don't get any further gains (especially since the actual game state updates generally can't be threaded, unfortunately).
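A hypothetical Python sketch of what 'infinitely parallelizable' means in practice: each pixel's value depends only on its own coordinates, so the work can be split across any number of workers with no locks and no coordination (the `shade` function and the dimensions here are made up for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 4, 3

def shade(pixel):
    # Each pixel depends only on its own coordinates -- no shared
    # state, so every pixel can be computed independently, in any order.
    x, y = pixel
    return (x * 31 + y * 17) % 256

pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]
with ThreadPoolExecutor(max_workers=4) as pool:
    frame = list(pool.map(shade, pixels))  # results come back in order
```

Game logic is the opposite case: this frame's state feeds into the next frame's, so the steps can't be handed out the same way.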
-
Yes, but imagine if developers had been able to figure out a way to make good use of seven processor cores...
-
It's not that the power could never be fully utilized, it was that the architecture driving that power was so difficult to make good use of, most people never bothered. I'd argue that the air force, which has a 'supercomputer cluster' consisting of a couple hundred networked PS3s, is making 'full use' of that power. (It was actually cheaper to install Linux on the PS3s than to build a dedicated supercomputer cluster -- more horsepower, less price. The only issue was when Sony decided it didn't want Linux on PS3s...)
-
As far as fashion goes, there are plenty of people out there who don't have a fashion sense -- look at all the people who wear bluetooth headsets like they're a fashion accessory. Productivity is important to some, and this thing is the ultimate (well, almost) in hands-free productivity. I could very easily see some artists using it in a manner similar to the sculptor in the sample video. Better yet, prescription lenses are available for it, from what I was just able to research.

Finally, while a 'better designed' successor could probably do a lot of damage to it, I'm not going to bet against Google being able to remain in the market. iPhones are in many ways superior to Android, but Android is still out there, going strong. And the next-best competitor is Blackberry, who really just can't keep up (and never could outside of business). On a recent family trip, I became 'Mr. Information' because I could quickly and readily look things up on my iPad when they came up in discussion; this thing would be a step past that. The price tag is high for something that in many ways is a novelty, but if the technology improves, I could readily see this replacing tablets for many functions.

In the end, your final point is probably correct, however: the most important part of this is that Google is pushing a technology that will become commonplace. The possibilities, especially once the price drops, are just too big to ignore. Imagine a reactor facility where, in addition to the panel after panel of controls, every worker had a personal display that could show something important to them. Imagine a military vessel, where a 'personal HUD' could readily enable a computer to help coordinate crew to handle damage control -- or a civilian vessel, where crew members could now dynamically direct passengers to lifeboats with room, rather than just hoping that they're sending the right number to each boat.
And those are just what I can think of; history has shown that in the end, we can never really imagine the impact a technology will have until after we're living it.
-
Wow, bitter much? It sounds like having Windows come pre-installed on new machines is half the problem -- because the tutorials that are supposed to help new users learn things like bringing up the charms bar are, from what I've read, on the install screens. So no install screen, no tutorial to the new UI.

That said, it's worth noting that some of what he's complaining about stems from the attempts of Windows to expand user functionality. The new start screen is designed to let apps display info without having to open them up. Want to check the weather? One press of the Windows key brings up the start screen, where you can see that it's expected to rain today, and you can see your latest tweets, facebook junk, and maybe some of your mail. It's not perfect, yet, but it's an effort to take their OS in a new, more user-friendly direction. Windows 8 is Windows Vista 2.0: it'll be good, eventually, but people need to get used to the ideas involved. (And MS needs to patch the heck out of it.) Windows 9 will probably be based on a similar system, but it'll be polished to the point where it's much more usable. Oh, and a big reason I want to upgrade? Improved multi-monitor support!
-
As a programmer, I'm not sure how far you can do that without adjusting the base code for the game.
-
I believe there is software out there that can do that.
-
Five crew killed in Canaries cruise ship safety drill
Rilbur replied to hh5's topic in C James Fan Club's Topics
HH5, your translation software is betraying you worse than usual -- at least, I can only assume it was the action of translation software that added 'pseidon adventure' to that phrase. A rogue wave is a wave much larger than the other waves occurring in the same place at that time. Which is to say, if over a period of time you measured wave heights of 5 feet, a 15-foot wave would be a rogue wave. If you're measuring wave heights of 15 feet on average, a 45-foot wave would be a rogue wave.
-
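That rule of thumb is easy to write down directly. A minimal sketch, assuming a simple 'bigger than some multiple of the typical height' test (the function name and the threshold factor are my own; the post's examples work out to roughly three times the typical height, while oceanographers often use about twice the significant wave height):

```python
def is_rogue(wave_height, typical_height, factor=2.0):
    # 'factor' is an assumed threshold: how many times taller than the
    # typical wave a wave must be before we call it rogue.
    return wave_height > factor * typical_height

# The post's examples: a 15 ft wave among 5 ft waves, or a
# 45 ft wave among 15 ft waves, both qualify.
print(is_rogue(15, 5))   # True
print(is_rogue(45, 15))  # True
print(is_rogue(6, 5))    # False -- just a slightly bigger wave
```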
Five crew killed in Canaries cruise ship safety drill
Rilbur replied to hh5's topic in C James Fan Club's Topics
Ouch! There's a reason for safety drills, but I think equipment inspection is supposed to come first... Going to be hard to convince anyone involved in that cruise to do their safety drills now...
-
I've recently 'updated' from LibreOffice to Scrivener -- and I love it. Just love it.
-
You are right that it's her call, but I'd say she should. It's probably going to be about as pleasant as a root canal without anesthetic, but it's still important. Worst case scenario, there's insufficient evidence to 'do' anything, but the guy gets a mark on his record. A mark that may help one day, when a pattern starts to emerge.

But there's the opportunity for much more than that. I did a fair bit of research when I was writing The Guardians, and one thing I read really stuck in my mind. Many rapists try to use the trial as an opportunity to rape the victim again -- and the article in question strongly urges women to use the trial to rape the rapist back. Look him in the eye, metaphorically spit in his face, and call the bastard on his behavior. Make the trial an exercise in your power over him. If nothing else, he's not going to be happy being called in and called a rapist -- and she did it to him. It's a way to regain power.

Which is the important thing to remember in cases of rape: it's about power -- having power taken from you. If you want to recover, go take that power back. The road to recovery is long, and much like an addict's, it's fraught with pitfalls. It's not about getting to a place where you've 'dealt' with the rape. It's about putting one foot in front of the other, putting it behind you and dealing with it. Days will come when everything falls apart, but you'll climb back up that much faster. That much better, stronger.
-
Um... What the heck is a pog? Wait... I think I missed that age group by about a year, lol.
-
Is there a facebook friendly version of that little bit of art anywhere? LOL
-
I doubt it -- I've never even heard that particular myth before. Still, if you're concerned, I have to admit you cannot beat the 'Apple Store Experience' -- and it is an (incredible) experience.
-
Pretty much my first thought as well.
