


Re: problem with large sas data sets #7

Not a thing wrong with that analogy...  ;-)

jim

Yes, I grew up on a farm, and do have manners... I think...  ;-)

Michael Raithel wrote:
> Dear SAS-L-ers,
>
> Shiping posted the following:
>
>> Hi, sometimes I have a problem using UNIX commands to copy,
>> mv, or soft-link large SAS data sets (over 4-5 GB). After I do
>> that, I can't open the data anymore. SAS complains:
>> ERROR: The open failed because library member
>> TEMP.XXXXXX_XX041608.DATA is damaged. Does anyone have similar
>> experience?
>>
> Shiping, yes, indeed, I have had this error on UNIX and on Linux servers
> in the past when copying very large SAS data sets between directories on
> the same server.  It was always due to not having enough disk space in
> the target directory, so the resultant SAS data set was kind of lopped
> off at the end.  You know; you can't put 100 pounds of feed into a
> 75-pound feedbag and expect to be able to carry _ALL_ of your feed to
> your livestock.  (<--Note the dime-store pseudo-intellectual analogy)!
>
> So, if I were in your sandals, I would check the disk space in my target
> directory (e.g., df -k) and double-check that I had enough space in it
> before trying to copy my HUMONGO SAS data set to it.
>
> Shiping, best of luck in shipping your SAS data sets between directories
> on your UNIX server!
>
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Michael A. Raithel
> "The man who wrote the book on performance"
> E-mail: MichaelRaithel@westat.com
>
> Author: Tuning SAS Applications in the MVS Environment
>
> Author: Tuning SAS Applications in the OS/390 and z/OS Environments,
> Second Edition
> http://www.sas.com/apps/pubscat/bookdetails.jsp?catid=1&pc=58172
>
> Author: The Complete Guide to SAS Indexes
> http://www.sas.com/apps/pubscat/bookdetails.jsp?catid=1&pc=60409
>
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
> Anything that has real and lasting value is always a gift from within. -
> Franz Kafka
> +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
>
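The disk-space check Michael describes can be scripted so the copy is refused outright instead of producing a lopped-off data set. A minimal sketch, assuming a plain `cp` on the same host; the `space_checked_copy` name and the paths in the usage line are made up for illustration:

```shell
#!/bin/sh
# Sketch: refuse to copy unless the target filesystem can hold the
# source file. Uses POSIX -P output of df so line 2, field 4 is the
# available space in KB.
space_checked_copy() {
    src=$1; destdir=$2
    need_kb=$(du -k "$src" | awk '{print $1}')            # data set size, KB
    free_kb=$(df -Pk "$destdir" | awk 'NR==2 {print $4}') # free KB on target fs
    if [ "$free_kb" -le "$need_kb" ]; then
        echo "not enough space: need ${need_kb} KB, have ${free_kb} KB" >&2
        return 1
    fi
    cp "$src" "$destdir"/
}
```

Usage would look like `space_checked_copy /data/temp/xxxxxx_xx041608.sas7bdat /bigdisk/temp`; a non-zero return means nothing was copied.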

--

"Games? Solitaire? I have a 2-node VAXcluster, 3 Windows 2000 servers, 2
Windows 2003 servers, 1 MySQL Database Server, 1 Postgres Database
Server, 1 Linux server, several Ubuntus and a direct satellite feed to
my windows desktop background, who needs toys???"  -  Jim
Posted by agnew614 on 4/24/2008 1:45:15 PM in comp.soft-sys.sas

Similar Articles:

Re: How to filter sas data sets into separate sas data sets #3
Lizette, Instead of trying to merge the two data sets, I would probably try to create a SAS format from the values of VAR1 in data set 1. Then, NODE1, NODE2 and NODE3 could be compared against the format for a match. The example below is a simplified version of what you could do and shows a printout of how it works. It has 5 observations in data set 1 and only 3 variables in data set 2, but I think the logic should hold for the example you gave. After the example is code that could be used to actually split the data as you had requested. Hope this helps. * create sas data set 1 ; data ...

Re: How to filter sas data sets into separate sas data sets #4
Something like this is the old way. You could use a proc sql if you have a new enough version. Increase your buffersize and if you have enough memory you may get it into a hash routine. DATA WORK.NEW; MERGE small (IN=A OBS=500) big ; BY ID_FIELD; IF A=1; RUN; QUIT; RICH -----Original Message----- From: SAS(r) Discussion [mailto:SAS-L@listserv.vt.edu] On Behalf Of Lizette Koehler Sent: Monday, April 02, 2007 10:53 AM To: SAS-L@LISTSERV.VT.EDU Subject: How to filter sas data sets into separate sas data sets Listers, This is my failing point in coding SAS. T...

Re: problem with large sas data sets
hhmm.. are you crossing operating systems, like a windows or vms inbetween unix hosts? is there FTP involved anywhere and it's sending binary as ascii? is the network link bad? sas_9264 wrote: > Hi, sometimes I have a problem to use unix command to copy, mv or soft > link large sas data set(over 4-5 GB). After I do that, I cann't open > that data anymore . Sas complain that ERROR: The open failed because > library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has > similar experience? > > Thanks, > > Shiping > -- "Games? Solitaire? I have a 2-node VAXcluster, 3 Windows 2000 servers, 2 Windows 2003 servers, 1 MySQL Database Server, 1 Postgres Database Server, 1 Linux server, several Ubuntus and a direct satellite feed to my windows desktop background, who needs toys???" - Jim ...

Re: problem with large sas data sets #5
If I/O as source of corruption is the criterion, then practically all applications are "inherently untrustworthy". True, such risk would have to increase as amount of I/O increases, but there are means of reducing that risk - i.e. contemporary analogs of the parity track in 9-track tapes. It seems to me that the issue is one of managing risk. For example, if a binary compare utility, suggested by RolandRB, declare two files equivalent, I'd like to know how running a PROC COMPARE further reduces risk. Regards, Mark -----Original Message----- From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of RolandRB Sent: Tuesday, April 22, 2008 3:22 AM To: SAS-L@LISTSERV.UGA.EDU Subject: Re: problem with large sas data sets On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote: > Hi, sometimes I have a problem to use unix command to copy, mv or soft > link large sas data set(over 4-5 GB). After I do that, I cann't open > that data anymore . Sas complain that ERROR: The open failed because > library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has > similar experience? > > Thanks, > > Shiping There was a post on this newsgroup (list) about a year ago of a huge dataset having a few corrupted records after doing a copy. Unfortunately, you should expect this. After copying a huge dataset, and somehow being sure it has flushed from the cache, you should use a utility to do a comparison or better, use proc ...
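The binary compare utility Mark weighs against PROC COMPARE can be as simple as `cmp`, which reads both files byte for byte and stops at the first difference. A sketch with a hypothetical `verify_copy` wrapper; the file names you would pass it are illustrative, not from the thread:

```shell
#!/bin/sh
# Sketch: byte-for-byte verification of a copy. cmp -s is silent and
# exits non-zero on the first differing byte, or if one file is shorter
# (i.e., a truncated copy).
verify_copy() {
    if cmp -s "$1" "$2"; then
        echo "identical"
    else
        echo "files differ -- do not trust the copy" >&2
        return 1
    fi
}
```

Because cmp never has to interpret pages, headers, or observations, a clean compare costs one sequential read of each file — which is the efficiency argument being made above.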

Re: problem with large sas data sets #4
I don't accept it is reasonable to expect data corruption to occur without cause, even if the datasets are many or large...? At least your offsite backup should always be secure; your overnight and intraday backup should resolve short term faults, but there should always be a known reason why dataset x has gone kablooey. Rgds. On Tue, 22 Apr 2008 00:21:30 -0700, RolandRB <rolandberry@HOTMAIL.COM> wrote: >On Apr 21, 4:51 pm, sas_9264 <Shiping9...@gmail.com> wrote: >> Hi, sometimes I have a problem to use unix command to copy, mv or soft >> link large sas data set(over 4-5 GB). After I do that, I cann't open >> that data anymore . Sas complain that ERROR: The open failed because >> library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has >> similar experience? >> >> Thanks, >> >> Shiping > >There was a post on this newsgroup (list) about a year ago of a huge >dataset having a few corrupted records after doing a copy. >Unfortunately, you should expect this. After copying a huge dataset, >and somehow being sure it has flushed from the cache, you should use a >utility to do a comparison or better, use proc compare to make sure >you made a good copy. Do a few million shuffles of chunks of data and >one or two might well not work. It's one of the reasons I state that a >"validated" sas reporting system can never be truly validated. There >are too man...

Re: problem with large sas data sets #6
Dear SAS-L-ers, Shiping posted the following: > Hi, sometimes I have a problem to use unix command to copy, > mv or soft link large sas data set(over 4-5 GB). After I do > that, I cann't open that data anymore . Sas complain that > ERROR: The open failed because library member > TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has similar > experience? > Shiping, yes, indeed, I have had this error on UNIX and on Linux servers in the past when copying very large SAS data sets between directories on the same server. It was always due to not having enough disk space in the target directory, so the resultant SAS data set was kindof lopped off at the end. You know; you can't put 100 pounds of feed into a 75-pound feedbag and expect to be able to carry _ALL_ of your feed to your livestock. (<--Note the dime-store pseudo-intellectual analogy)! So, if I were in your sandals, I would check the disk space in my target directory (e.g. df -k)and double-verify that I had enough space in it before trying to copy my HUMONGO SAS data set to it. Shiping, best of luck in shipping your SAS data sets between directories on your UNIX server! +++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Michael A. Raithel "The man who wrote the book on performance" E-mail: MichaelRaithel@westat.com Author: Tuning SAS Applications in the MVS Environment Author: Tuning SAS Applications in the OS/390 and z/OS Environments, Second Edition http://www.sa...

Re: problem with large sas data sets #8
Roland: You said: Those are maybe just the errors you know about. Do a "proc compare" after copying one of these humungous datasets, and somehow making sure it is no longer sitting in the disk cache, and you might find more errors. This comment causes me to repeat (and reword) my unanswered question to you earlier in this thread. Why would you do a PROC COMPARE after a system copy, instead of a system compare utility? The latter is presumably faster (i.e. no need to deconstruct blocks into records, etc.) and just as good (maybe better) at telling you if there is a difference in a single bit. Regards, Mark -----Original Message----- From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of RolandRB Sent: Thursday, April 24, 2008 12:27 PM To: SAS-L@LISTSERV.UGA.EDU Subject: Re: problem with large sas data sets On Apr 24, 2:37 pm, michaelrait...@WESTAT.COM (Michael Raithel) wrote: > Dear SAS-L-ers, > > Shiping posted the following: > > > Hi, sometimes I have a problem to use unix command to copy, > > mv or soft link large sas data set(over 4-5 GB). After I do > > that, I cann't open that data anymore . Sas complain that > > ERROR: The open failed because library member > > TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has similar > > experience? > > Shiping, yes, indeed, I have had this error on UNIX and on Linux servers > in the past when copying very large SAS data sets...
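A system compare needs both files on one host; when the original and the copy sit on different machines, checksums are a common middle ground — compute a digest on each side and compare the printed strings. A sketch assuming GNU `md5sum` is available (some UNIXes ship `md5` or `cksum` instead):

```shell
#!/bin/sh
# Sketch: print just the digest for a file, so the outputs from two
# hosts can be compared directly. Assumes GNU md5sum.
sum_of() {
    md5sum "$1" | awk '{print $1}'
}
```

Run `sum_of` against the source on one host and the copy on the other; matching digests give the same single-bit-difference assurance being discussed, without shipping either file anywhere.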

Re: problem with large sas data sets #3
ok... If we are shipping data between 2 systems, there could be an "alien" system such as a windows, vms, amiga, or whatever between that may not be transmitting the data correctly, If the system is only shuffling data between 2 disk drives on the same CPU, then it could either be a hardware failure of some type, or maybe a write-behind cache could not be flushing completely before the program wants the data again. If I move huge amounts of data, i can see the drive hammering (by the lights) for minutes after the move complete, as the write-behind cache empties... j. Shiping Wang wrote: > Hi Jim > > On 4/21/08, *Jim Agnew* <agnew@vcu.edu <mailto:agnew@vcu.edu>> wrote: > > But, is there a "different" system between the 2 unixes? > > > I know that we use solaris > > > Or, is the SAS dataset being copied to another drive within the same > system? > > > That's possible. Could that cause a problem for data damage? > > j. > > Shiping Wang wrote: > > I use Sas under unix system. > > On 4/21/08, *Jim Agnew* <agnew@vcu.edu <mailto:agnew@vcu.edu> > <mailto:agnew@vcu.edu <mailto:agnew@vcu.edu>>> wrote: > > hhmm.. are you crossing operating systems, like a windows or vms > inbetween unix hosts? is there FTP involved anywhere and it's > sending binary as ascii? ...

Re: problem with large sas data sets #2
But, is there a "different" system between the 2 unixes? Or, is the SAS dataset being copied to another drive within the same system? j. Shiping Wang wrote: > I use Sas under unix system. > > On 4/21/08, *Jim Agnew* <agnew@vcu.edu <mailto:agnew@vcu.edu>> wrote: > > hhmm.. are you crossing operating systems, like a windows or vms > inbetween unix hosts? is there FTP involved anywhere and it's > sending binary as ascii? is the network link bad? > > sas_9264 wrote: > > Hi, sometimes I have a problem to use unix command to copy, mv > or soft > link large sas data set(over 4-5 GB). After I do that, I cann't open > that data anymore . Sas complain that ERROR: The open failed because > library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has > similar experience? > > Thanks, > > Shiping > > > -- > > "Games? Solitaire? I have a 2-node VAXcluster, 3 Windows 2000 > servers, 2 Windows 2003 servers, 1 MySQL Database Server, 1 Postgres > Database Server, 1 Linux server, several Ubuntus and a direct > satellite feed to my windows desktop background, who needs toys???" > - Jim > > -- "Games? Solitaire? I have a 2-node VAXcluster, 3 Windows 2000 servers, 2 Windows 2003 servers, 1 MySQL Database Server, 1 Postgres Database Server, 1 Li...

Re: How to filter sas data sets into separate sas data s ets
I think that both Ron's (as he mentioned) and Richard solutions require that VAR1 is in both datasets. But from the original post, it seemed to me that VAR1 is only in data set 1, and it must be matched to 1 of 3 variables in data set 2 (NODE1, NODE2 or NODE3) to be output to the NEWLIST data set. For this reason, I think a format is one possible approach. Maybe the original poster can clarify this point. Thanks. Jack Clark Research Analyst Center for Health Program Development and Management University of Maryland, Baltimore County -----Original Message----- From: SAS(r) Discussio...

Re: Compressing data sets (was Re: [SAS-L]) #2 #7
I'm crazy..yes, SAS/Access...not Connect... ________________________________ Bruce A. Johnson bjohnson@solucient.com -----Original Message----- From: Jack Hamilton [mailto:JackHamilton@firsthealth.com] Sent: Tuesday, January 06, 2004 1:22 PM To: SAS-L@LISTSERV.UGA.EDU; Bruce Johnson Subject: Re: [SAS-L] Compressing data sets (was Re: [SAS-L]) A new way to read/write Excel files in 9.1 was mentioned at SUGI, but as I recall it required a server running SAS/Access to PC File Formats. It wasn't SAS/Connect doing the reading and writing, just as SAS/Connect isn't doing the remote...

Re: Deleting SAS Data from a SAS DATASET #7
A view helps on deletes, but I wonder how it affects performance of = querying the data- wouldn't storing the data in 24 different locations = cause a significant slowdown in perfomance upon querying the data versus = having it all in one table that is indexed? If this data is queryied a = lot but only deleted once a month, the time in querying (which probably = is in peak time) could be much more important than the time in deleting = (which could be run when the computer is not busy, such as nights or = weekends). =20 -Mary ----- Original Message -----=20 From: ./ ADD NAME=3DData _null_,=20 To: SAS-L@LISTSERV.UGA.EDU=20 Sent: Friday, August 15, 2008 3:51 PM Subject: Re: Deleting SAS Data from a SAS DATASET Summary: PROC DATASETS; AGE statement. + VIEWs This won't help you delete data from your very big data set, but you may find this example interesting. You say you append data monthly to a big data set then when big gets too big you need to clean out the old. And that takes a very long time. However if you don't physically append but use a view to append/combine you may find it easier to get rid of the unwanted old data. Consider this code. it pushes MonthlyUpdate onto the stack of 24 data sets and the 24th data set is deleted. Then all the data sets get renamed to produce a new group of 24. You can see from the notes how the operation works. The data sets don't have to use a numbered range M01-M24 I did that...

Re: search SAS data set from SAS code
> From: Rose > Hi All, > Suppose I have a sas permanent data set which was created > early, I know > the library path but I couldn't remember in which sas program code I > created it. how can I search from so many sas program files in > different folders and find it. a problem familiar to all of us delayed-housekeeping folks. Libname Libref '<directory-specification>'; DATA LibRef.DataSetName; use your system utilities to search for the dir-spec of your libref. search: *.sas containing text: <dir-spec> once you have found the libname...

problem with large sas data sets
Hi, sometimes I have a problem to use unix command to copy, mv or soft link large sas data set(over 4-5 GB). After I do that, I cann't open that data anymore . Sas complain that ERROR: The open failed because library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has similar experience? Thanks, Shiping On Apr 21, 4:51=A0pm, sas_9264 <Shiping9...@gmail.com> wrote: > Hi, sometimes I have a problem to use unix command to copy, mv or soft > link large sas data set(over 4-5 GB). After I do that, I cann't open > that data anymore . Sas complain that ERROR: The open failed because > library member TEMP.XXXXXX_XX041608.DATA is damaged.Does anyone has > similar experience? > > Thanks, > > Shiping There was a post on this newsgroup (list) about a year ago of a huge dataset having a few corrupted records after doing a copy. Unfortunately, you should expect this. After copying a huge dataset, and somehow being sure it has flushed from the cache, you should use a utility to do a comparison or better, use proc compare to make sure you made a good copy. Do a few million shuffles of chunks of data and one or two might well not work. It's one of the reasons I state that a "validated" sas reporting system can never be truly validated. There are too many I/Os going on and sometimes these might fail. By its design of being a procedural language and creating many datasets in the process of running a complex job, it is inherently ...

Re: search SAS data set from SAS code #5
Rose, You have some good advice on search techniques, but they may beinadequate. I hope your LIBNAME wasn't something like libname lib "&dir" ; Perhaps you should also search for ".member", but that also couldhave the same problem. You might also look for key variablenames or values, or procedures that you know created the data.The date from a PROC CONTENTs might provide useful information,or an old report created by the same program with a footnote,"Source code: ...". Maybe data lib.w ( label="created by ..." ) ; would be a good habit to ...

Re: search SAS data set from SAS code #2
Rose, The answer to your question depends on your operating system. In Windows, there's the Search tool. In Unix/Linux, you can use grep Bob Abelson HGSI 240 314 4400 x1374 bob_abelson@hgsi.com Rose <myr_rose@YAHOO.COM> Sent by: "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU> 04/19/2005 11:13 AM Please respond to myr_rose To: SAS-L@LISTSERV.UGA.EDU cc: Subject: search SAS data set from SAS code Hi All, Suppose I have a sas permanent data set which was created early, I know the library path but I couldn't remember in which s...
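Bob's grep approach, sketched as a small helper that walks a tree of program files; the directory layout and library path you would pass it are hypothetical:

```shell
#!/bin/sh
# Sketch: list every .sas program under a directory tree that mentions
# the given library path. Arguments: program tree, text to look for.
find_creators() {
    find "$1" -name '*.sas' -exec grep -l "$2" {} +
}
```

For example, `find_creators ~/sasprogs /projects/mylib` prints the candidate programs; note this only finds literal occurrences — a path hidden behind a macro variable (as warned elsewhere in this thread) will not match.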

Re: Reading & Writing SAS data sets without SAS #3
Chang, You're correct in that a number of companies have done it. I believe SPSS can do it, WPS, Stat Transfer, dbmscopy, and perhaps others have also done it. But what I think is unique about this is that Alan is talking about offering plug-ins so you can roll-your-own so to speak. How cool would it be to have some type of driver/plugin for R? Phil Philip Rack MineQuest, LLC SAS & WPS Consulting and WPS Reseller Tel: (614) 457-3714 Web: www.MineQuest.com Blog: www.MineQuest.com/WordPress -----Original Message----- From: Chang Chung [mailto:chang_y_chung@HOTMAIL.COM] Sent: Monday...

Re: Reading SAS data sets on UNIX by non-SAS apps #2
John: Following on Richard's thoughtful suggestions, the Affinium system would likely capture data from csv files. SAS PROC EXPORT produces them quickly, and loading them into external systems works faster for relatively basic data structures and data formats, in my experience, than xml parsing. Sig -----Original Message----- From: SAS(r) Discussion [mailto:SAS-L@LISTSERV.UGA.EDU] On Behalf Of John Bentley Sent: Monday, October 18, 2004 10:10 AM To: SAS-L@LISTSERV.UGA.EDU Subject: Reading SAS data sets on UNIX by non-SAS apps I have SAS data sets on AIX that we need to read with Unica's Affinium campaign management software, also on a UNIX box. (Let's not get into why we didn't go with the SAS Solution.) SAS Institute doesn't supply an ODBC driver for the UNIX environment, and the Affinium implementors don't want to use the SAS SQL Library for C and or deal with APIs. Other that dumping the SAS data sets as flat files, can anyone suggest a solution? Thanks in advance for the help. ...

Re: Exporting a SAS data set to Text file on SAS unix #3
hi ... actually, what I posted earlier was too much code (sorry) this is enough (a bit more succinct) * variable names into a macro variable (tab separated); proc sql noprint; select name into :vars separated by '09'x from dictionary.columns where libname eq 'SASHELP' and memname eq 'CLASS' order varnum; quit; data _null_; file 'z:\class.txt' dsd dlm='09'x ; if _n_ eq 1 then put "&vars"; set sashelp.class; put (_all_) (:); run; -- Mike Zdeb U@Albany School of Public Health One University Place Rensselaer, New York 12144-3456 P/518-402...

Re: What r the data types in SAS? in Base SAS , and SAS SQL
> From: Amar Mundankar > Sent: Tuesday, July 21, 2009 8:10 AM > To: sas-l@uga.edu > Subject: What r the data types in SAS? in Base SAS , and SAS SQL > > Hi all, > What are the different data types in Base SAS and SAS SQL?? character, lengths from 1 to 32,000+ numeric: lengths from 2 to 8 dates are a subtype of numeric and are identified by their date, datetime, or time formats the new proc TSPL (Table Server Processing Language) supports ANSI data types: bigint, tinyint, etc. http://support.sas.com/documentation/cdl/en/tsag/30878/HTML/default/a003 065339.htm http://s...

Re: Creating an empty SAS data set #7
Yes, "where 1=2" is potentially really bad. SAS could realize that it's always false and behave accordingly, but on the other hand it might read every record in OLD. Shouldn't be hard to test if anyone cares, but given the better alternatives, I don't think I care. -- JackHamilton@FirstHealth.com Manager, Technical Development Metrics Department, First Health West Sacramento, California USA Coelum, non animum mutant, qui trans mare currunt. >>> "Chang Y. Chung" <chang_y_chung@HOTMAIL.COM> 12/28/04 10:47 PM >>> On Wed, 29 Dec 2004 ...

Re: Data set problems and basic understanding of SAS #2
On Wed, 13 Jul 2005 19:37:55 -0400, Robert Slotpole <rslotpole@COMCAST.NET> wrote: >I keep working through the tutorials but there are so many of them. I know >this is a verry basic question but just the same any and all help greatly >appreciated. OK. But posting essentially the same question multiple times under different subject headings isn't really going to accomplish much. >In my main table D6 I have 200 observations consisting of 40 tickers >repeated for each of 5 dates. I issue a proc summary by date and get 5 >observations (1 for each date) that I now wa...

Re: Data set problems and basic understanding of SAS part 2
On Wed, 13 Jul 2005 22:17:24 -0400, Robert Slotpole <rslotpole@COMCAST.NET> wrote: >In following up on my previous post I can create and merge the data: > >the following output shows what I mean: > > total_ total_ total_ > Obs DATE TICKER endshare dol_div sales purchases > > 1 2002-12-31 *$$$ 0.0 0.0 9617604 - 9617604 > 2 2002-12-31 ABT 5700.0 0.0 9617604 - 9617604 > 3 2002-12-31 AMAT 0.0 0.0 ...

Re: Matching a Large SAS Data Set to a Huge Sybase Table
I say bring Mohammed to the mountain--one of the 3's makes the most intuitive sense to me. For whatever that's worth. 8^) If you've got a sybase index on your sid field, you might try this: ibname dbor sybase user = &idnid. pass = &idnpw. server = &idnserver. ; proc sql ; create table outds as select distinct c.* , a.xid , a.cd from GBO.inds c INNER JOIN dbor.dbtbl(dbkey = sid) a on c.sid = a.sid ; quit ; The smaller your sas dset relative to the sybase table, the better that should work. HTH, -...
