Researchers use cell-phone data, not precogs, to predict crime in London


Just this year we've seen open data give rise to recreations of Denmark in Minecraft, tools for comparing cities at the same scale, and collections of geo-mapped tweets and traffic lights. But what about a practical application for all of that info, one with a more tangible benefit to society, like, say, crime prediction? That's what the University of Trento in Italy had in mind with its "Once Upon a Crime" study. The researchers coupled freely available (and anonymized, aggregated) demographic and mobile phone data with real crime data to forecast where in London an infraction might occur. Just how accurate was it? The Italian scientists say their predictive algorithm correctly anticipated whether an area would see high or low levels of crime 70 percent of the time. No, it's not quite enough to let Chief Anderton and co. start running wild just yet, but it could help cities struggling with budget woes decide which areas need more (or fewer) police patrols.
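
For the curious, here is a minimal, purely illustrative sketch of what a "high vs. low crime area" classifier looks like in code. The features, labels and model choice below are invented for the example; this is not the paper's actual pipeline or data.

```python
# Illustrative sketch only: a generic "high vs. low crime" area classifier.
# All features and labels below are synthetic; this is NOT the study's real pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_areas = 1000

# Hypothetical per-area features: aggregated mobile-phone activity plus open demographics.
calls = rng.poisson(200, n_areas)           # avg. daily call/SMS events in the area
visitor_share = rng.uniform(0, 1, n_areas)  # share of visitors vs. residents
median_age = rng.uniform(18, 60, n_areas)   # from open demographic data
X = np.column_stack([calls, visitor_share, median_age])

# Hypothetical label: 1 = high-crime area, 0 = low-crime area.
y = (calls * visitor_share + rng.normal(0, 20, n_areas) > 100).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")  # the study reports ~70%
```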

[Image credit: Getty Images]


Via: Slashdot

Source: arXiv (PDF)

http://ift.tt/1tsI4NG

Source: Engadget Full RSS Feed http://ift.tt/1tsJ8RL

Animation shows the transformation of a woman’s body throughout life

This is great. Celia Bullwinkel made Sidewalk, an animation that shows a little girl become a woman while walking on the sidewalk and all that it entails. There’s a perfect amount of awkwardness and embarrassment, annoyance at rude gawkers and a whole lot of hard work.



http://ift.tt/XPIP7b

Source: Gizmodo http://ift.tt/1tsIB29

Health record providers Cerner and Athenahealth working with Apple on HealthKit apps

Athenahealth and Cerner are working on apps that integrate with Apple’s HealthKit platform.

http://ift.tt/Xizft6

Source: App Advice http://ift.tt/1ugrIWz

iFixit’s Live iPhone 6 Plus Teardown Finds Oversized Battery

iFixit, famous breakers of things, were Down Under to scoop up an iPhone 6 Plus as soon as it went on sale. Now, they’re teasing the rest of us with a step-by-step liveblog of the teardown process.

The disassembly isn’t the simplest — you have to extract two proprietary Pentalobe screws, and then lever the entire front display assembly away from the rest of the body with a suction cup, being careful not to rip the TouchID sensor wire clean off.

The 6 Plus is rocking a 2915mAh battery, which is nearly double the 1560mAh cell in the iPhone 5S. However, compared to other 5.5-inch phones, it’s a bit of a disappointment: the OnePlus One is packing a 3100mAh cell, and the Galaxy Note 3 (which, admittedly, is a bit better endowed in the size department) has 3200mAh up its sleeve.

We’ll keep an eye on the live-stream and let you know if they find any tiny Jony Ive action figures hidden inside the RAM modules. [iFixit]

Image credit: iFixit

http://ift.tt/YWNsh8

Source: Gizmodo http://ift.tt/1sytq3r

Malaysia’s tech manufacturing sector based on forced labor


"Hardly a major brand name" doing business in Malaysia is untainted by the use of forced labor from trafficked workers, according to a study backed by the US Department of Labor.

http://ift.tt/ZsXUgj

Source: Boing Boing http://ift.tt/ZsXVRg

Google to encrypt data on new version of Android by default

Encryption has been an option on Android since 2011, but Android L, due out later this year, will have it switched on by default.



http://ift.tt/1tstyWr

Source: CNET News http://ift.tt/1uglZ2N

NVIDIA’s new GPU proves moon landing truthers wrong

Despite overwhelming evidence to the contrary, there are still people who believe Earth is the only celestial body humanity has ever walked upon. You’ve heard it before — the moon landing was a hoax, a mere TV drama produced by Stanley Kubrick and presented as fact to dupe the Soviet Union into giving up the space race. This deliciously ludicrous conspiracy theory has been debunked countless times, but now its advocates have one more refutation to deny: NVIDIA’s Voxel Global Illumination tech demo. It’s a GPU-powered recreation of the Apollo 11 landing site that uses dynamic lighting technology to address common claims of moon-deniers, and it’s pretty neat.

Mark Daly, NVIDIA’s senior director of content development, told Engadget its Apollo 11 demo was created as an answer to Sponza, a popular global illumination model frequently used by the academic crowd. It’s a good model, he says, but it’s not very interesting to watch. "Jen-Hsun [Huang], our CEO, looked at it and said ‘Isn’t there something better?’ Anyway, one of our research engineers happened to put this slide up of Buzz Aldrin on the moon in a meeting and said ‘this speaks global illumination to me because of all the hoaxers and deniers of the moon landing.’" Conspiracy theorists say that Aldrin simply couldn’t have been lit up the way he is in the picture. NVIDIA took it as a challenge.

Buzz Aldrin (right) next to his computer-generated doppelganger (left)

NVIDIA chose to create a 3D rendition of a photograph showing Buzz Aldrin descending a ladder to the moon’s surface. Folks who insist the landing was a hoax claim that without the light-diffusing effect of an atmosphere, the shadow of the lander should cast Aldrin in almost complete darkness. "You can explain it," Daly says, "and say light bounces around even on the moon… or you can show it. We decided to take the approach to show it, but it turns out that it’s not that easy — there isn’t a lot of light on [Aldrin]." Daly’s challenge was not in placing lights around a computer-simulated scene of the Apollo 11 landing, but in using NVIDIA’s Voxel Global Illumination to make a single light source, the simulated sun, correctly reflect off of every material in the scene. To do this, he had to research the materials of NASA’s lander, the brightness of our local star and even the reflectivity of the moon’s surface.

"It turns out there is a lot of information about the astronomical bodies floating out there in space," he explains. "Starting with the sun. The sun itself is 128,500 lux — that’s lumens per square meter – but it turns out the moon is a crappy reflector of light." Daly discovered that the moon is only 12-percent reflective, and absorbs most of the sunlight hitting it. On the other hand, 12-percent of 128,500 lux is quite a lot. "It’s the equivalent to ten 100-watt lightbulbs per square meter of light bouncing off the moon." More than enough make Aldrin visible under the lander’s shadow.

While this exercise showed that the moon was reflective enough to highlight Aldrin, something was still wrong. Daly noticed that the astronaut’s side wasn’t lit the same in NVIDIA’s simulation as it was in NASA’s photograph, but he wasn’t sure why. "A couple of people really into the moon landing told me, ‘by the way, you should take into account Neil Armstrong and the light coming off of him.’ At first I was like, yeah, whatever — the sun is doing all the work — something the size of a guy in a space suit isn’t going to contribute much light." He quickly learned his assumption was wrong: the material on the outside of the astronauts’ suits is 85 percent reflective. "Sure enough, we put him in there, adjusted the reflectivity of his suit, put him in the position where the camera would be… and it contributed another 10 percent or so of light to the side of Buzz Aldrin."

Daly found that his own doubt mirrored the claims of some landing-deniers. Some claim that because Aldrin is in shadow, there would have to be some sort of auxiliary lighting behind the camera, supposedly proof that the image was taken in a studio. "As it turns out, yes! They’re right — there was a light there, it was the sun reflecting off of Neil Armstrong’s suit. I really didn’t believe it would contribute that much."

It’s the dynamic nature of Voxel Global Illumination that allows NVIDIA to poke fun at these hoax claims: the entire scene renders light reflection on the fly, based solely on the illumination provided by the simulated sun. "We learned a heck of a lot about how all these materials reflect light and put them into the material descriptions, the BRDF (bidirectional reflectance distribution function)," Daly said, explaining how developers create a VXGI lighting environment. "The VXGI we’ve integrated into Unreal Engine 4 reads all those materials you’ve given it and, based on the reflectivity of those materials, constructs a lighting module." It’s a lot of work to set up, but it makes adjusting the light easy after the fact. NVIDIA is able to drag the sun to new positions, add new elements to the scene or even remove the moon’s natural reflectivity to create the false conditions moon-truthers think represent the lunar surface.
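
To make the role of those material descriptions concrete, here is a minimal sketch assuming an idealized, perfectly diffuse (Lambertian) surface. It only illustrates why a material's reflectivity drives the bounce light a renderer accumulates; it is not NVIDIA's VXGI code or the actual BRDFs the demo uses.

```python
# Single-bounce diffuse lighting sketch with an idealized Lambertian surface.
# Not NVIDIA's VXGI implementation; just the principle that albedo scales bounce light.
import numpy as np

def diffuse_bounce(incident_lux, albedo, normal, light_dir):
    """Light a perfectly diffuse surface scatters back, per unit area."""
    cos_theta = max(float(np.dot(normal, light_dir)), 0.0)  # geometric falloff
    return incident_lux * cos_theta * albedo

sun_lux = 128_500.0                    # solar illuminance quoted in the article
normal = np.array([0.0, 1.0, 0.0])     # surface facing straight up
to_sun = np.array([0.0, 1.0, 0.0])     # sun directly overhead, for simplicity

regolith = diffuse_bounce(sun_lux, 0.12, normal, to_sun)   # dark lunar surface
spacesuit = diffuse_bounce(sun_lux, 0.85, normal, to_sun)  # highly reflective suit
print(regolith, spacesuit)             # the suit scatters ~7x more light per unit area
```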

This versatility allowed NVIDIA to address one more hoax claim before our demo ended: the stars. If NASA really landed on the moon, why can’t we see the stars in any of the Apollo 11 photographs? Well, that’s more a matter of film exposure than lighting trickery. Because the unfiltered sun is so ridiculously bright (128,500 lux, remember?), the astronauts’ cameras were set to use a small aperture, letting in only a fraction of the available light in order to keep the picture from blowing out. NVIDIA was able to simulate this too, and widened the virtual camera’s aperture to reveal the demo’s simulated stars. It worked, but at the expense of the camera’s true subject matter: Aldrin’s descent to the lunar surface became a blown-out, overexposed mess.
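
The exposure argument is simple arithmetic: the light a lens admits scales with the aperture area, i.e. with the inverse square of the f-number. The f-stops below are illustrative assumptions, not the Apollo cameras' actual settings.

```python
# How much more light a wider aperture admits, relative to a stopped-down one.
# The specific f-numbers are assumptions chosen only to illustrate the scaling.
def relative_exposure(f_number, reference_f_number):
    """Light reaching the film compared to the reference aperture (area ratio)."""
    return (reference_f_number / f_number) ** 2

stopped_down = 11.0   # assumed setting for the brilliantly sunlit lunar surface
wide_open = 2.8       # assumed setting that would start to reveal faint stars

gain = relative_exposure(wide_open, stopped_down)
print(f"opening up from f/{stopped_down:g} to f/{wide_open:g} admits ~{gain:.0f}x more light")
# ~15x more light: enough for stars, but the sunlit scene blows out, as in NVIDIA's demo.
```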

Science has been able to debunk these moon hoax theories for decades, but it’s nice to see a simulation that can illustrate those explanations in real time. Better still, Daly says NVIDIA is currently building a consumer UI for the demo and will release it to the public sometime in the next several weeks. It’s also a project that has become personally important to him. "Because I got to see a lot of this live when I was a kid, it has a special meaning to me. I know in Apollo 1 two men died, and other men risked their lives to get into these crazy contraptions to actually do this. It’s kind of offensive to me when people say this didn’t happen," he explains. "I want to show that it really happened and these people risked their lives. They actually did go to the moon."


http://ift.tt/1mkhky5

Source: Engadget Full RSS Feed http://ift.tt/1syoknQ

NVIDIA’s latest GPU crams 4K images on 1080p displays

Back in February, NVIDIA trotted out the very first desktop GPUs to feature its new Maxwell architecture: the GeForce GTX 750 and 750 Ti. Those entry-level cards were paragons of efficiency, but they were hardly strong examples of what the company’s latest graphics technology was truly capable of. Today, NVIDIA revealed the cards that are: the GeForce GTX 980 and 970 desktop GPUs. The new flagships still benefit from the efficiency gains made by the first-generation Maxwell cards, but lean far more heavily on performance. If you’re a PC gamer with a GTX 680 or 560 in your tower, these are the cards NVIDIA wants you to upgrade to.

On paper, there’s reason enough to appreciate these cards’ power: the $549 GTX 980 boasts a 1.1GHz base clock speed (1.2GHz with boost), 2048 CUDA cores and 4GB of GDDR5 video memory. The $329 GTX 970 sheds a few of those CUDA cores (1664 in total) and clocks down to 1GHz (1.1GHz with boost), but it consumes a little less power for the downsizing: 145W to the 980’s 165W. In NVIDIA’s tests, the new cards reportedly outperformed AMD’s kit with almost half the power draw. Still, even NVIDIA knows stats and core counts mean bupkis to the general consumer — gamers want to know what all these specifications are going to do for them. We met up with Scott Herkelman, NVIDIA’s general manager of GeForce, to learn about Maxwell’s new tricks.

"One of the things that we thought about when we wanted to launch Maxwell is this dichotomy that gamers are running into today," Herkelman told Engadget. NVIDIA found that gamers either wanted to increase visuals past a game’s prescribed performance settings or maximize framerate without sacrificing image quality. Surprise, surprise: Maxwell’s second generation GPUs introduce two new technologies that can help.

Dynamic Super Resolution, for instance, lies to your game to make it output a higher resolution than your display expects. "We render a 4K image in the background and then put it through a 13-tap Gaussian filter," he explained. "Then we bring that down to a 1080p monitor." As far as the game is concerned, it’s piping out an ultra-high-resolution image to a 4K monitor, but Maxwell scales the result down to your 1080p display. This feature is designed to improve picture quality in a game that is already tuned to its best visual settings. Basically, it makes downsampling easy. It looks pretty good in action too, but it isn’t perfect: some 4K UI elements don’t scale well on smaller monitors. Herkelman says NVIDIA is continuing to improve and tweak the feature.
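
As a rough sketch of the downsampling idea only (not NVIDIA's actual filter, sigma or pipeline), the snippet below takes a stand-in 4K frame, blurs it with a Gaussian and resamples it to 1080p:

```python
# Toy version of "render high, filter, show low": blur a 4K frame, then resample to 1080p.
# The Gaussian sigma and resampling order are arbitrary illustrative choices.
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

hi_res = np.random.rand(2160, 3840)                 # stand-in for a rendered 4K frame
smoothed = gaussian_filter(hi_res, sigma=1.0)       # low-pass filter before resampling
lo_res = zoom(smoothed, 1080 / 2160, order=1)       # bilinear resample down to 1080p

print(hi_res.shape, "->", lo_res.shape)             # (2160, 3840) -> (1080, 1920)
```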

"The other new technology we have is called MFAA, or Multi-Frame Sample Anti-Aliasing," Herkelman said. "This is for those games where you already have great image quality but you want more performance." Like traditional anti-aliasing, it can sample a pixel multiple times, but MFAA splits the work up over multiple frames. Herkleman says this can improve performance by as much as 30-percent.

Finally, high-end Maxwell cards will be able to take advantage of games that use Voxel Global Illumination, a new dynamic lighting technology that promises to enable destructible environments with active, realistic lighting. NVIDIA says the new lighting solution will be available for UE4 and other major engines later this year.

Not the bells and whistles you’re looking for? Fine — Maxwell has a few more features hidden away, but you won’t be able to use them until the consumer virtual reality market takes off. NVIDIA’s VR Direct program is working to bring low-latency graphics to consumer VR headsets like the Oculus Rift. Herkelman showed off a Maxwell-powered EVE: Valkyrie demo as an example. Indeed, the demo was smooth, but VR Direct’s future impact on GeForce Experience really caught our attention. In addition to supporting SLI, DSR and MFAA, NVIDIA’s VR Direct promises "auto stereo," a feature designed to bend a game not intended for virtual reality into the Oculus Rift’s stereoscopic perspective. Herkelman told us that the feature would probably have a whitelist of compatible games, not unlike how the company implements NVIDIA 3D Vision.

So, when can consumers get their hands on the new Maxwell? Soon. NVIDIA CEO Jen-Hsun Huang officially announced the new GeForce GTX cards at Game24 this evening, and they should be available for sale tomorrow morning from NVIDIA’s usual hardware partners: EVGA, ASUS, Gigabyte, MSI and PNY, among others. Are you planning to upgrade, or will you wait to see what AMD cooks up in competition? Let us know what you think in the comments section below.


http://ift.tt/1ugjM7E

Source: Engadget Full RSS Feed http://ift.tt/1qPDgSS

Nvidia GeForce GTX 980: The Beast That Sips Electricity

Over the past year, PC graphics cards have swelled to gargantuan proportions, with price tags to match: Nvidia’s GeForce GTX Titan costs an incredible $999, to say nothing of the $3,000 Titan Z you might consider if your family is in the oil business. Today, Nvidia’s trying something different: the new GeForce GTX 980 is not only one of the fastest* cards Nvidia has ever built, it’s also incredibly efficient.

Starting at a way more reasonable $549, the GTX 980 also only has 5.2 billion transistors—far fewer than the 7.1 billion you’d find in a current-gen GTX 780 Ti or GTX Titan—and its 2048 CUDA cores and 256-bit memory interface wouldn’t seem to stack up well against the 2880 cores and 384-bit bus you’d find in the company’s former 780 Ti flagship. Indeed, it doesn’t have quite the same texture fill rate and there’s a good bit less memory bandwidth.

But the new Maxwell-based GPU inside this card is clocked at a practically unheard-of 1126 MHz (compare to 875MHz), the card has 4GB of GDDR5 memory (up from 3GB), and most impressively the entire kit runs at just 165 watts TDP. Compare to the power-hungry 250W GTX 780 and Titan series, and you can see we’re talking about a very different beast.

Perhaps more interesting is what you (and game developers) will be able to do with the new Maxwell-based graphics card thanks to some new Nvidia techniques. With Dynamic Super Resolution, you can make games look way smoother on your 1080p monitor by actually rendering them at 4K resolution and letting the GPU seamlessly scale them to your smaller display. (It’ll run way slower, of course, but could be great if you have the extra frames per second.)


Multi-frame sampling anti-aliasing (MFAA) should give you the reduced jaggies of 8x MSAA with closer to the cost of 4x MSAA. And with Voxel Global Illumination, developers will be able to build games with way prettier, more realistic dynamic lighting that actually changes based on actions the players take. That one will be baked into Unreal Engine 4, and we may start seeing games with the feature early next year.

And last but not least, Nvidia’s making a whole host of improvements specifically for virtual reality headsets like the Oculus Rift, including reducing the latency between your head movements and the action you see on screen so you’re not as likely to get sick, and (Nvidia claims) automatically making games playable that weren’t actually designed to be playable in virtual reality. Nvidia will now also support VR headsets with dual-graphics-card SLI setups, with one GPU rendering images for each eye.


If $549 is still too rich for your blood but you want those features, you’ll have another option besides the GTX 980. You can get the same Maxwell benefits, only roughly 10-20 percent slower on average, with the $329 GTX 970. The two new GPUs should be available today, and entirely replace the GTX 780 Ti, GTX 780, and GTX 770. If you want something cheaper, the GTX 760 will now retail for $219.

*We’ll add impressions from graphics card reviewers to this post as they roll in. It’s not clear whether this card is actually faster at games than the previous generation, though it definitely has a faster clockspeed.

http://ift.tt/Xrsntt

Source: Gizmodo http://ift.tt/XrsnJL

Seeing the Aurora Borealis in real time is better than any time lapse

We’ve seen so many beautiful time lapses of the Aurora Borealis (not that I’m complaining!) that it’s pretty refreshing to see how ethereal it can be in real time too. I guess that means I should really see it in real life.

The video was filmed by O Chul Kwon in Yellowknife, Canada. If I can’t go there, I want to project this onto a wall and fall asleep to it every night.



http://ift.tt/1mkeN6Q

Source: Gizmodo http://ift.tt/1tslAfW