Total Soft Tech
  • Home
  • Products
    • Acqua
    • Houra
    • Med+
    • Vitta
  • About Us
    • Total SoftTech Company Profile
    • Who We Are
    • Management Set Up
    • Management Team
    • Words from the CEO
  • Careers
  • Contact Us
  • Privacy Policy

Category: Software News


Games and diversions dominate mobile apps use

by Soloist on 5 April 2013 in Software News · No comment

Smartphone and tablet users are launching more apps than they did two years ago, an app measurement firm says.

The average person in the U.S. spends 2 hours and 38 minutes a day on smartphones and tablets.

Flurry, an app measurement and advertising platform, has released some interesting data showing how people spend their time on iOS and Android devices, and the implications of that use.

App use accounts for 80% of time spent on the devices, with the balance spent in a browser, the company reported in a blog post.

[Chart: mobile app usage breakdown]

When it comes to apps, 32% of a consumer’s time is spent playing games, the largest segment. Facebook use follows at 18%, with the remaining time divided among multiple categories, including “utility” and “entertainment” at 8% each. Productivity uses accounted for the smallest share, at 2%.
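As a rough back-of-the-envelope check (not part of Flurry’s report), those shares can be converted into minutes of the average day quoted above; the sketch below assumes the category percentages apply to the full 2 hours and 38 minutes of device time.

```python
# Back-of-the-envelope conversion of the Flurry shares quoted above into
# minutes per day. Assumption (not stated explicitly in the article): the
# category percentages are fractions of the total 2 h 38 min of device time.

TOTAL_MINUTES = 2 * 60 + 38  # 158 minutes per day on smartphones and tablets

shares = {
    "apps overall": 0.80,
    "browser": 0.20,
    "games": 0.32,
    "Facebook": 0.18,
    "entertainment": 0.08,
    "utility": 0.08,
    "productivity": 0.02,
}

for category, share in shares.items():
    print(f"{category:>13}: {share * TOTAL_MINUTES:5.1f} min/day")
# games, for example, works out to roughly 50 of the 158 minutes
```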

Consumers are also using more apps, said Flurry. Over the last three years, the average number of apps launched per day by consumers climbed from 7.2 in 2010 to 7.9 in 2012.

“Assertions that people are using fewer apps in 2012 than they did in 2010 appear to be incorrect,” wrote Simon Khalaf, president and CEO of Flurry.

But Khalaf also sketches out a very dynamic market. In the last quarter of 2012, 63% of the apps were new, “and most likely not even developed in 2011,” or, if they were, poorly adopted.

“We believe that with consumers continuing to try so many new apps, the app market is still in early stages and there remains room for innovation as well as breakthrough new applications,” wrote Khalaf.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Will more smartphones support Facebook Home?

by Soloist on 5 April 2013 in Software News · No comment

So far, Facebook has put native Home only on the HTC First, while six more smartphones can run the app; it’s unclear whether Home will attract more phone makers, analysts say.

The HTC First smartphone will have native support for Facebook Home when it ships on AT&T April 12. Some analysts wonder how soon — or whether — native support for the app will be added to more smartphones.

Given Facebook’s enormous following of 1 billion-plus users, the odds are that other Android phone makers will look to cash in on the native Facebook Home screen opportunity, some analysts said in interviews Thursday.

No other native Facebook Home phones were announced today, and several Android device makers wouldn’t comment on their plans for the app.

Facebook made it clear that open source Android is the best platform for Home, and analysts said it’s highly unlikely that it will ever run on iOS, Windows Phone or BlackBerry because those platforms are locked down.

So far, Facebook said in a statement, the Home app is supported on the HTC One X, the HTC One X+, Samsung’s Galaxy S III and Galaxy Note II. The app will also run on the forthcoming HTC One and Samsung Galaxy S4 “and on more devices in the coming months.”

Three analysts predicted that other Android smartphone makers will support Google Play downloads of the free Home app starting April 12.

Facebook CEO Mark Zuckerberg noted during Home’s unveiling today that a popular smartphone might only sell 10 million units in the early days of sales.

But there are many times more Facebook users, and Facebook is by far the most often-used application on a smartphone, he added. “We spend 20% of the time on phones on Facebook,” he noted.

Device makers other than HTC or Samsung will likely support Facebook Home “out of competitive pressure,” said Patrick Moorhead, an analyst at Moor Insights and Strategy. “I believe Facebook is paying the carrier and the handset provider, too, because it does involve more work and support for everyone. Other Android makers will only want a native Facebook Home if they are being paid by the carrier or Facebook.”

At the same time, the handset maker and carrier will lose control of the user, “because your primary experience will be Facebook, not Samsung or HTC,” Moorhead said.

“I am sure more handset makers will come to support Home,” said Carolina Milanesi, an analyst at Gartner. “I cannot believe they would limit themselves.”

She said she couldn’t confirm whether the HTC First will have Home preloaded exclusively for some time or whether more such devices will come.

Jack Gold, an analyst at J. Gold Associates, said he didn’t feel that the native Home on HTC First will do well, and might only have a 25% chance of long-term success.

“How many users want a hostile takeover of their phone?” he said. “How many people want a Facebook phone?”

Even with 1 billion Facebook users, Gold said there probably aren’t enough frequent users that would want the Home experience to make it a great success.

All the new and recent Android manufacturers and devices will eventually support a Home app download, Gold said, but whether another native Home device is built will depend on the success of the HTC First phone.

Several analysts said that if users don’t like the downloaded Home app, they will be able to remove it. Gold and Moorhead said it would be hard to remove the Home launcher from the HTC First, however.

“If you bought the HTC First knowing about Home, why would you want to remove it?” Gold asked.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Dell to release new Windows tablets later this year

by Soloist on 4 April 2013 in Software News · No comment

The company is exploring designs with screens of 10 inches and larger.

Dell will release Windows tablets later this year that could potentially include devices with screen sizes larger than 10 inches.

The products will be a refresh of Dell’s current tablet offerings, said Steve Lalla, vice president and general manager of mobile products and solutions at Dell, in an interview.

Dell is exploring designs with screen sizes different from its current 10-inch models, he said, though it was unclear whether those new sizes will be among the products released later this year. Dell is primarily interested in screen sizes of 10 inches or larger, Lalla said.

The new tablets will succeed the XPS 10, which runs Windows RT, and the Latitude 10, which runs Windows 8 Pro. Dell continues to work on Windows 8 and Windows RT devices, Lalla said, but he didn’t give further details about the upcoming products or their exact release dates.

The company also plans to release thinner and lighter laptops and convertibles later this year.

Dell’s tablets are geared toward the BYOD (bring-your-own-device) market, meaning they’re designed to be suitable for use both at work and at home. The company aims to deliver better remote management, cloud and encryption features in its future products.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Oracle brings data center fabric to Sparc systems

by Soloist on 4 April 2013 in Software News · No comment

Sparc T5, T4 and M5 servers will work with the high-speed interconnect technology.

Oracle has extended its data center fabric to its Sparc-based Unix platforms, promising to let enterprises tie more servers and applications into the high-speed infrastructure.

The fabric technology, which Oracle acquired in its purchase of startup Xsigo Systems last year, connects servers and storage over Ethernet and InfiniBand and allows for thousands of virtual network interfaces. That saves IT departments from having to install multiple network interface cards and host bus adapters in their physical servers, while tying together the resources in the data center at speeds of up to 80Gbps (gigabits per second).

The addition of Unix support is the first change Oracle has made to Xsigo’s technology since the acquisition, apart from rebranding it as Oracle Virtual Networking, said Charlie Boyle, senior director of marketing for Oracle’s data center division. The company added Oracle Virtual Networking support to its Sparc T5, T4 and M5 servers and for the Oracle Solaris 11 OS on both Sparc and x86 hardware. Connecting Unix servers to the fabric will give users, as well as other servers, faster access to the critical applications that often run on those platforms, he said.

Oracle expanded the Sparc-based T and M server lines last week with the T5 and the M5-32, both of which are based on new processors. It’s the first time the company has built M-class servers based on its own chips.

Oracle Virtual Networking is designed to deliver the benefits of software-defined networking (SDN), including rapid application provisioning, detailed quality-of-service controls and simplified movement of virtual machines from one physical server to another. It’s built around the Oracle Fabric Interconnect hardware platform, which provides the high-speed connectivity. The company claims Oracle Virtual Networking can boost application performance by four times while cutting LAN and SAN capital expenses in half.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Groups say ICANN unprepared for gTLD launch

by Soloist on 3 April 2013 in Software News · No comment

The swift gTLD rollout could endanger the stability of the DNS Root Zone, Verisign said.

The delegation of new generic top-level domains (gTLDs) by the Internet Corporation for Assigned Names and Numbers (ICANN) is premature and could pose risks to the security and stability of the Domain Name System (DNS) and affect the working of the whole Internet, Verisign has warned.

As ICANN pushes for an April 23 launch of the first new gTLDs, Verisign has raised concerns in a report outlining new gTLD security and stability issues, sent to ICANN and filed with the U.S. Securities and Exchange Commission (SEC) last week.

The risks named in the report should be addressed in a timely manner by ICANN; otherwise, the broader implications of new gTLDs for parties that rely on the Internet DNS will be “far-reaching,” Verisign said.

Security concerns from Verisign and other organizations indicate ICANN may be headed for a “train wreck,” said the Association of National Advertisers (ANA), a trade group.

ICANN is in the process of evaluating applications for new gTLDs like .sport and .news. The first 27 gTLDs have already passed the initial evaluation phase, with the Japanese words for Amazon, store and fashion among the first to pass.

But the process is going too fast, according to Verisign, which applied for the transliteration of dot.com in Chinese. Verisign’s application has also passed the initial evaluation.

“In order to ensure a successful implementation of each new gTLD, it is essential that proper planning be conducted in advance,” Verisign said in the report. This preparation should entail the development of a project plan for each new gTLD to be implemented, it added.

“These plans should align with ICANN’s timelines, thus minimizing impacts to current registry operations, as well as the overall DNS and broader Internet ecosystem,” Verisign said. It called on the ICANN board to address the issues appropriately before delegating any new gTLDs, “as the risk of a misstep in this direction could have far-reaching and long-lasting residual implications.”

The Verisign report, coupled with a March 15 letter from PayPal also raising security concerns, demonstrate a need for ICANN to slow down, said Dan Jaffe, executive vice president of government relations for the ANA.

While the trade group has objected to the new gTLDs because of trademark concerns, the Verisign and PayPal security concerns may indicate even more serious issues, Jaffe said Tuesday. “It would be reckless to move forward until these problems are resolved,” he said.

ICANN said it takes the security issues raised by Verisign “very seriously,” but maintained that the issues have been addressed.

“Security of the DNS has always been paramount for ICANN,” ICANN CEO Fadi Chehadé said in a statement. “Every issue raised by Verisign in this report has been discussed within the ICANN community during the development of the new gTLD program over the past eight years. The program is operationally on track and I anticipate no delays.”

But Verisign said that rolling out multitudes of new gTLDs could cause problems for the DNS Root Zone, the highest level of the Domain Name System (DNS) hierarchy, which holds the delegation records for all top-level domains, including gTLDs like .com and .org as well as country-code top-level domains like .us and .uk.
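To make the root zone’s role concrete, the short sketch below asks the DNS for the delegation (NS) records it publishes for a few existing TLDs; this is purely an illustration, assumes the third-party dnspython package is installed, and has nothing to do with Verisign’s or ICANN’s internal tooling.

```python
# Illustration only: list the name servers the public DNS currently
# delegates a few existing top-level domains to. Requires the third-party
# dnspython package (pip install dnspython) and network access.
import dns.resolver

for tld in ("com.", "org.", "us."):
    answer = dns.resolver.resolve(tld, "NS")
    servers = sorted(str(rr).rstrip(".") for rr in answer)
    print(f"{tld:<5} is delegated to {len(servers)} name servers, e.g. {servers[0]}")
```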

“Without a well constructed and well reasoned process model, and at the scale of changes foreseen with the addition of the unprecedented rate of the new gTLDs being added, the entire DNS hierarchy faces the potential for issues at or near the root of the DNS tree, and the fallout from such a change could affect all delegations,” Verisign said.

 

ICANN seems to have taken a very “ICANN-centric role” in the rollout of new gTLDs and has given little consideration to registry operators that will need to prepare for the changes, including dealing with security implications, Verisign said.

“It actually appears as though there is little to no time allotted for operators to adequately prepare,” Verisign said.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Network administrators look to SDN with hope

by Soloist on 3 April 2013 in Software News · No comment

Enterprise and carrier executives see more control, faster provisioning in the still-developing software-defined networking technology.

Some network operators say they need new tools to set up and manage connections in a virtualized world, even if that means adopting software-defined networking technology, which is still in its infancy.

Server and storage virtualization has freed computing and data from the confines of boxes in fixed locations, letting IT handle resources more efficiently. But networks still require manual configuration to keep those resources properly linked. SDN is designed to extend virtualization to networks, too.

That’s an idea whose time has come, according to executives from two carriers and a large enterprise that are planning to test the Virtualized Services Platform, the SDN system introduced on Tuesday by Alcatel-Lucent venture Nuage Networks. They spoke on a panel at the company’s launch event in Santa Clara, California.

The provisioning of new virtual machines at the University of Pittsburgh Medical Center routinely kicks off lengthy discussions among IT staff about setting up the necessary connections and privileges for those VMs, said Bill Hanna, UPMC’s vice president of technical services. With 3,500 VMs and growing, that’s not a good thing to hear, he said.

“Today, the frustration is really with the virtualization folks, because the networks really do lag,” Hanna said. The problem is the network. “The architecture … does not lend itself to virtualization.”

When the IT department moves VMs from one physical server to another, it changes the patterns of data traffic, sometimes dramatically, Hanna said. Network administrators want a view into those moves and a way to allocate the right amount of bandwidth to each VM wherever it is, he said.

Telus, a wireline carrier that serves consumers and businesses across Canada, can now automatically provision a broadband connection to a home but not to an enterprise data center, said Walter Miron, a director of technology strategy at Telus. The company hopes SDN can help bring that speed to its business service rollouts and is evaluating various SDN systems, Miron said.

The carrier operates many data centers itself and wants to be able to manage them in conjunction with the network, he said.

“Our philosophy is that the data center is part of the network, not adjacent to the network,” Miron said.

French service provider SFR is going up against Amazon Web Services with a cloud computing service operated by Numergy, a venture it established with the French government and other partners. With traditional data center networks, the company won’t be able to deliver the service levels that its enterprise customers will demand, said Pierre Barnabe, director general of SFR Business Team, the carrier’s enterprise division.

Numergy also wants to offer seven different levels of security. “For that, we need SDN,” Barnabe said.

Hanna noted that VSP is based on technology from Alcatel’s Service Router Operating System, which UPMC already uses in about 100 routers. Still, the network executives know they are delving into largely uncharted territory.

“This is the biggest change in networking in years,” said Hanna, who has been working in the field since the mid-1980s.

SDN won’t fulfill all the promises that are being made for it today, but failures and shortcomings are a part of progress, SFR’s Barnabe said in an interview after the panel discussion. He compared the various flavors of SDN to the different cellular technologies that are now converging in LTE.

“We need to test a lot of things,” Barnabe said. “We need to push innovation.”

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Trade groups look for uses for recycled CRT glass

by Soloist on 2 April 2013 in Software News · No comment

The two groups offer a $10,000 prize for the best ideas.

An electronics and a recycling trade group are looking for ways to reuse recycled cathode ray tube (CRT) glass from computer monitors and television sets, with a US$10,000 prize for the best proposal.

The Consumer Electronics Association (CEA) and the Institute of Scrap Recycling Industries (ISRI) launched the CRT Challenge Monday, with the two groups looking for financially viable, environmentally conscious proposals for using recycled CRT glass. The challenge is hosted on crowd-sourced incentive site Innocentive.com.

CRT technology has been replaced in the monitor market by liquid crystal displays (LCDs), light-emitting diodes (LEDs) and plasma displays, but the trade groups expect more than 2 billion pounds of legacy CRT TVs and monitors to enter the recycling stream in the coming years.

CEA and ISRI will accept submissions for the CRT Challenge until June 30. The groups will pick the winning proposal based on economic and environmental benefits, and CEA will award $10,000 to the winner. CEA and ISRI will publicize and share proposals with manufacturers, retailers and recyclers.

CRTs were widely used in displays, including TV sets, computer screens and diagnostic equipment, for many years. Because new CRT displays were the primary destination for recovered CRT glass, the end-use markets for CRT glass have decreased considerably, the trade groups said.

CEA issued its first CRT Challenge in 2011.

The trade group named three winners: Mario Rosato, who proposed a closed-loop process for separating the lead from the glass in a form with high market value; Nulife Glass Processing, which proposed a process that uses an extremely energy-efficient electrically heated furnace, uniquely designed to produce minimal emissions; and Robert Kirby, who submitted an idea for combining CRT glass with cement to create tile and bricks that are tested, labeled and sold specifically for applications where lead shielding is required, such as X-ray and fluoroscopy rooms.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Amazon.com upgrades Cloud Drive with file syncing

by Soloist on 2 April 2013 in Software News · No comment

Amazon’s cloud storage service adds a feature included in Dropbox and Google’s Drive.

Amazon.com has added a file-syncing feature to its online storage product, Cloud Drive, putting the service on par with competitors such as Dropbox and Google’s Drive.

The syncing feature will allow users to view an up-to-date file across several devices. Cloud Drive is a desktop application for Windows and Mac, and Amazon.com also has a version designed for Android, just for photos. Amazon.com also offers a Web-based upload panel if users don’t want to download the desktop application.

Online storage applications are a very competitive market segment. The services allow users to spread the same version of a file across several computers and prevent the loss of a file in case a computer is lost or stolen or breaks. But it is also an area where technical glitches and performance issues are a concern.

Late last month, Google Drive suffered three service problems that prevented some users from accessing their files and applications. One of those outages lasted three hours and affected about a third of requests to the service. Effects included error messages, long load times and timeouts.

Dropbox, which uses Amazon’s Web Services for its infrastructure, had problems with syncing and uploading files in January.

Amazon.com’s Cloud Drive desktop application is compatible with Windows XP, Vista, 7, and 8, and Mac OS X versions 10.6 through 10.8. Cloud Drive offers 5GB of free storage, with annual subscription packages going up to 1,000GB for US$500.
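For a sense of scale, the quoted top tier works out to a simple per-gigabyte price; the arithmetic below uses only the figures stated above and is not an Amazon rate card.

```python
# Simple cost arithmetic from the figures quoted above:
# 5 GB free, and annual plans scaling up to 1,000 GB for US$500.
free_gb = 5
top_tier_gb = 1_000
top_tier_usd_per_year = 500

print(f"Top tier: ${top_tier_usd_per_year / top_tier_gb:.2f} per GB per year")  # $0.50
print(f"Free allowance: {free_gb} GB at no cost")
```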

The company has incorporated Cloud Drive into its Kindle Fire tablet. Photos uploaded to Cloud Drive will appear in the Kindle Fire’s Photo library as well as the Cloud Drive Photos application on an Android device. Photos that are uploaded from an Android device are also copied into the Cloud Drive folder on a desktop computer.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


SAP sues to protect customers from patent suits

by Soloist on 1 April 2013 in Software News · No comment

The vendor is seeking a declaratory judgment against patent holder Pi-Net International.

SAP has filed a court action against patent holder Pi-Net International, which it says has filed patent infringement lawsuits against a number of SAP customers.

The alleged infringements, which concern three Pi-Net patents, are associated with SAP’s Financial Fusion software, which is used for online banking and other purposes. SAP gained the software through its 2010 acquisition of Sybase.

At least one SAP customer has asked the vendor to indemnify it against any liabilities that could ensue from Pi-Net’s lawsuits, according to SAP’s filing last week in the U.S. District Court for the Northern District of California.

SAP is seeking a declaratory judgment stating that its products don’t infringe on Pi-Net’s patents, a move that would shield both itself and customers from litigation.

The three patents in question cover “multimedia transaction services,” “web application network portal” and “value-added network system for enabling real-time, bi-directional transactions on a network,” according to SAP’s suit.

Some evidence suggests Pi-Net is a so-called non-practicing entity, placing more effort on enforcing its intellectual property rights in search of licensing fees and damages than on creating products for sale in the market.

Pi-Net does not appear to have a dedicated website, but a phone number listed for the company is also associated with the Menlo Park, California, firm WebXchange.

The latter company’s website contains a description for a product called Transweb that “intelligently routes, switches, tracks, and manages value-added Internet transactions.”

WebXchange’s software was beta-tested at Cisco and put into use at First Data, according to the site.

Another page describes WebXchange as an “aggressive startup” and lists job openings for software engineers.

However, it was not clear on Friday how current or accurate the information on the dated-looking site was, as a copyright notice on one page was labeled 2007.

In 2010, WebXchange and Pi-Net’s founder, Lakshmi Arunachalam, filed an amicus curiae brief in a patent case brought against Facebook by Leader Technologies. Such briefs allow third parties to weigh in about information relevant to a case.

The brief describes her as “the inventor of a portfolio of the earliest Internet patents that give control over any real-time web transaction from any web application.”

In addition, “these patents give her control over the internet cloud and any cloud application,” according to the brief, which also asserts that Pi-Net and WebXchange are “practicing entities with the earliest products implementing web applications based on her patents.”

“Arunachalam invests 100% of her time in research and development (R&D) and in the patenting of new internet-based products,” it adds. She did not respond to requests for comment on Friday.

Court records and news reports show that WebXchange has sued other tech vendors in recent years for patent infringement, including Dell and Microsoft.

Records also show that Pi-Net has filed lawsuits against a series of financial institutions within the past year or so, including Merrill Lynch and Bank of America. It wasn’t clear Friday whether these were among the SAP customers referred to in SAP’s filing, which were not named.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com


Dell working on ARM supercomputer prototypes

by Soloist on 1 April 2013 in Software News · No comment

Dell believes an ARM supercomputer will be applicable in specific use cases.

Not fazed by a takeover battle looming on the sidelines, members of Dell’s research division are putting together the pieces for prototype ARM supercomputers that could be deployed in the future.

Dell has a good idea what an ARM supercomputer would look like, and prototype designs and other “parts” are being experimented with in Dell’s laboratories, said Tim Carroll, director at Dell’s research computing group.

“It is a solution right now looking for a problem,” Carroll said. “ARM is going to have a place. The market is going to tell us what that is.”

ARM processors go into most smartphones and tablets and are attracting interest for use in servers. The power-efficient CPUs could help cut energy consumed by servers in data centers while bringing enough processing power to handle fast-moving Web search or social-networking requests. Dell is already offering low- to midrange prototype ARM servers for customers to play with.

Depending on workloads, ARM processors could find limited use in supercomputers, Carroll said. ARM processors will deliver dollar savings per FLOP (floating-point operation) per rack, and some institution will take a leap of faith and use ARM processors in a supercomputer, Carroll said.

Some of the world’s fastest supercomputers use x86 processors from Intel or Advanced Micro Devices, Power processors from IBM or Sparc processors from Oracle. ARM processors are currently not considered powerful enough for supercomputers, which are mostly used by research organizations running complex calculations.

The inability to pass a certain processing capability threshold is a handicap for ARM in supercomputing, but Carroll noted that the market can change swiftly, as witnessed by graphics processors, which are now a key co-processor alongside CPUs in supercomputers.

“Do not presuppose you understand all the different use cases that are out there,” Carroll said.

The use case for ARM processors has yet to be determined, but curious researchers will find answers, Carroll said. In that regard they will be ahead of the commercial sector, which has deployment cycles and deadlines to keep in mind, Carroll said.

The supercomputing market is also changing with the emergence of the cloud, which could influence the way systems are built, Carroll said. Complex calculations may be done in remote servers, with the cloud being the mechanism for the request and delivery of information.

“We are going to get there. Cloud as a transport mechanism to tie together all these big infrastructure implementations is going to have to happen,” Carroll said.

ARM processors are also inexpensive, especially when compared to FPGAs (field programmable gate arrays), Carroll said. FPGAs are reprogrammable circuits used in many supercomputers.

The Barcelona Supercomputing Center has been at the forefront of experimenting with ARM supercomputers. Last year BSC said it was making a prototype supercomputer with Samsung’s Exynos 5 dual-core processor, and in late 2011 it announced a supercomputer based on Nvidia’s Tegra 3 processors.

Chips for ARM servers are offered by Calxeda, Marvell and Texas Instruments. ARM processors right now are only 32- and 40-bit. But ARM has already announced its first 64-bit ARMv8 architecture and accompanying Cortex-A57 and Cortex-A53 processor designs based on the architecture. Advanced Micro Devices, AppliedMicro and others are expected to offer integrated ARM chips for servers.

But it could take a while for ARM to be accepted by the research community, Carroll said. Software written today is still not being targeted at ARM servers, and researchers tend to hold on to old code, Carroll said. ARM, x86 and Power processors run on different instruction sets and support different code bases.

Dell today builds servers with x86 processors. A supercomputer based on Dell’s blade design was deployed last year at the Texas Advanced Computing Center at the University of Texas at Austin. Called Stampede and rated the world’s seventh-fastest supercomputer, the machine delivers a peak performance of 10 petaflops. The world’s fastest supercomputer, Titan, at the U.S. Department of Energy’s Oak Ridge National Laboratory, delivers a peak performance of 20 petaflops.

The Stampede supercomputer has a total of 102,400 processor cores which include Intel’s Xeon E5 CPU, Nvidia graphics processors and Intel’s Xeon Phi co-processor. The 182-rack supercomputer has 270TB of RAM, 14 petabytes of storage, occupies 11,000 square feet of space, employs 75 miles of network cables and draws 3 megawatts of power.
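The Stampede figures above also imply some rough derived numbers; the sketch below computes peak performance per watt and memory per core from the stated specs (peak, not sustained, performance).

```python
# Derived from the Stampede figures quoted above: 10 petaflops peak,
# 3 megawatts of power, and 270 TB of RAM across 102,400 cores.
peak_flops = 10e15      # 10 petaflops (peak)
power_watts = 3e6       # 3 MW
ram_bytes = 270e12      # 270 TB
cores = 102_400

print(f"Peak efficiency: {peak_flops / power_watts / 1e9:.1f} GFLOPS per watt")  # ~3.3
print(f"Memory per core: {ram_bytes / cores / 1e9:.1f} GB per core")             # ~2.6
```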

Dell isn’t primarily viewed as a supercomputing vendor, but Carroll wants to meet the needs of customers regardless of processor architecture.

“We’re getting better and better,” Carroll said.

 

For more information on where Total Soft Tech Solutions Inc. gets its news, visit: www.computerworld.com




© Copyright 2015 - Total Softtech Solutions Incorporated.