Dell 7080 Micro RAM Puzzler Solved

When something appears too good to be true, watch out! I keep poking around on my new Dell 7080 Micro and learned something interesting, but also mildly distressing, about the configuration I ordered. The essentials show up as the DRAM frequency in the lead-in graphic for this story. The rated speed for that module is 2933 MHz, but the reported rate is 1463.2 MHz (doubled: ~2926 MHz). Something is amiss, eh? A quick trip to the manual's memory section makes this Dell 7080 Micro RAM puzzler solved.

RTFM = Dell 7080 Micro RAM Puzzler Solved

When I did the math, I saw the reported memory clock rate was half the SODIMM module’s rated speed. Immediately I suspected something might be up with dual-channel vs. single-channel memory performance. Ack! I was right: the very first entry in the memory section of the 7080 Micro manual reads as follows:

[Screenshot of the manual's memory note] I'm now aware that doubling up on memory is the right way to go on this machine. Sigh.

Turns out that the clock rate information is what it should be, though. DDR memory like this fetches two chunks of data with each clock cycle. Hence the term "double data rate," or DDR.
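Run the numbers and they line up: 1463.2 MHz of reported clock × 2 transfers per clock = 2926.4 MT/s, within rounding distance of the module's 2933 rating.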

Doubling Up Is Good, So That’s What I’m Doing

I just ordered a pair of 16 GB G.Skill RipJaws modules for the machine. They’re rated for DDR4 3200, which means they’ll loaf along at the 7080 Micro’s default clock rate of 2933 with ease. I’m guessing further that the winsat mem command will show an improvement from its current level of 15954 MB/sec (the P-5550 reports 32945 by comparison, and I’m expecting something more along those lines). We’ll see.
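If you want to check your own machine, the same tool is built into Windows. Run it from an administrative Command Prompt or PowerShell window; on my systems, the number of interest appears near the end of the output, on the line labeled Memory Performance, in MB/s:

winsat mem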

But wait! I still have Dell's review unit of the 7080 Micro, so I just booted it up to run the winsat mem command there. It has two 16 GB memory modules. And sure enough, the results are significantly higher than those for the single-module unit: 26323 MB/sec lands more than halfway between the single-module value of 15954 and the P-5550 value of 32945. That's an improvement of roughly 65 percent. I'm pretty sure that's what I'll see on my unit once it's upgraded, too. (Same CPU, same motherboard, so why not?)

I’ll know for sure sometime later today, because Amazon says my order will arrive no later than 10 PM tonight. While I was at it, I also purchased a Samsung EVO 860 1 TB SATA SSD to park in the 7080 Micro’s currently vacant SATA slot. The RAM set me back about $115, and the drive $100, both of which I consider excellent prices.

Given that the 7080 Micro was already pretty fast, I’m guessing this will make it just a tad faster. I’ll report the results of my checks after upgrading the RAM and installing the new SSD later today. Stay tuned!

Note Added 2 Days Later

Amazon didn’t deliver the goods until today. Took me all of 5 minutes to unbutton the 7080 Micro case. Only tool needed was a Phillips Head screwdriver. The SATA drive mounted into its sleeve with no tools, but I had to undo the thumbscrew on the outside of the case and remove the fan assembly (3-spring-tensioned PH screws) to replace the old memory module with 2 new ones.

Now for the good news: the new test results are astounding. When I ran the winsat mem command on the 7080 Micro I purchased, the results came back at 34430 MB/sec. Over multiple iterations, that number falls in a range from 34430 to 34450.

Guess what? This 7080 Micro’s RAM speed is slightly FASTER (just over 4%) than the Precision Workstation 5550. Woo hoo! Turned out to be a good investment. In fact, the new memory numbers are just over twice as fast for dual-channel RAM as they were for single channel. By doubling up, I got a great performance boost.

The moral of the story is: if you buy a Dell 7080 Micro, make sure to populate both SO-DIMM slots, so you can get the best performance out of however much RAM you give it to work with. It makes a BIG difference.


Hard Disks Remain Useful PC Storage Devices

Hmmm. I just read a disturbing story over at Gizmodo. Something of a rant from Sam Rutherford, it explains “Why I’m Finally Getting Rid of All My HDDs Forever.” I’ve been following his work for some time, and he usually has intelligent and useful things to say. This time, though, I’m opposed to his position. In fact, I still firmly believe that hard disks remain useful PC storage devices. Quick count: I have at least 10 of them here in my office, at capacities ranging from 1 TB to 8 TB.

Why Say: Hard Disks Remain Useful PC Storage Devices?

If I understand his complaint, Mr. Rutherford is giving up on HDDs (Hard Disk Drives) because several of them gave up on him recently. One failure cost him 2 TB of data, some of it precious. I say: Boo hoo!

The lead-in graphic for this story comes from my production PC running a freeware program named CrystalDiskInfo. (Note: grab the Standard Edition; the others have ads and bundleware.) Notice the top of that display lists Windows drives C:, J:, K:, G:, D:, I:, F:, and H:. All of them show blue dots and the word "Good" as well. These elements provide rough measures of disk health for both HDDs and SSDs. Of the 8 drives shown, 3 are SSDs, 4 are HDDs, and 1 is a so-called hybrid HDD; all are healthy.

Mr. Rutherford could have used this tool, or others like it, of which there are many (see these Carl Chao and WindowsReport survey pieces, for example). Then he would have known his problem HDDs were headed for trouble before they failed. Plus, he himself admits he erred in not backing up the drive whose failure caused data loss. I check all my drives monthly (both SSDs and HDDs), looking for signs of impending trouble, as part of routine maintenance.
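If you'd rather script such checks, Windows can report a basic health verdict without third-party tools. Here's a minimal PowerShell sketch (it assumes Windows 8 or later, where the Get-PhysicalDisk cmdlet is available); like CrystalDiskInfo's blue dots, it's a rough indicator, not a guarantee:

# List each physical disk with its bus type, media type, and health verdict
Get-PhysicalDisk | Select-Object FriendlyName, BusType, MediaType, HealthStatus | Format-Table -AutoSize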

Backup, Backup and More Backup

SSDs are not mechanical devices, so they don't suffer mechanical failures. Over the 10 years or so I've owned SSDs (perhaps a couple of dozen by now), not one has ever failed on me. Over the 36 years I've owned HDDs, I've had half a dozen fail out of the hundreds I've used. But it's inevitable that I will suffer an SSD failure sometime, even though I've yet to experience one personally. Why? Because all devices fail, given enough time and use.

Personally, I think HDDs still have a place in my storage hierarchy. I just bought two 8 TB drives earlier this year, for about $165 each, or roughly $21 per TB. That's way cheaper storage than even the cheapest of SSDs on today's market, and much more capacity in a single device than I'd want to purchase in solid-state form. (Note: a 7.68 TB Samsung 870 QVO SSD costs $750 at Newegg right now, or nearly $98 per TB. Thus it aims at those with more money than sense, or those with cash-generating workflows that can actually cover such costs.)

The real secret to protecting data is multiple backups. I bought those 8 TB drives to back up all my other drives, so they're my second local line of defense. I also pay for 5 TB of online storage at OneDrive and Dropbox, and keep two extra copies of production OSes, key files, and archives in the cloud as well. I back up my production PCs daily, my test PCs weekly, and key bits and pieces to the cloud weekly. Basta!
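For that local line of defense, a scheduled robocopy mirror is one simple way to go. Here's a minimal sketch, with hypothetical paths; note that /MIR deletes destination files that no longer exist at the source, so aim it at a dedicated backup folder, and create the log folder first (or drop the /LOG switch):

robocopy D:\Data H:\Backups\Data /MIR /R:2 /W:5 /LOG:C:\Logs\backup-data.log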


Interesting Single-Builder SSD Benefits

Just read an absolutely fascinating story at Tom's Hardware by Sean Webster. Entitled "Not-So-Solid State: SSD Makers Swap Parts Without Telling Us," it's worth a read. The main point it makes is that many builders of SSDs, most notably Adata and its XPG brand, build SSDs using parts from multiple makers. Their products change over time with the availability of component parts such as controllers and flash memory chips. In the case the story lays out, a highly recommended drive suffered performance losses when better, faster parts were replaced with newer, slower ones. This leads me to understand there can be interesting single-builder SSD benefits.

Where Interesting Single-Builder SSD Benefits Come From

Samsung, chief among SSD makers, builds all of the parts that go onto its SSDs. Thus, it controls the mix of elements on those devices completely. When constituent parts change, the company always changes its model numbers so that buyers know there’s “something different” on board. Tom’s points to practices from WD, Kingston, Crucial and other makers to indicate that the majority do indeed change model numbers as constituent parts change, too. Thus, the most interesting single-builder SSD benefits clearly come from end-to-end supply chain control. Third-party builders don’t have that luxury, because they buy parts from multiple suppliers.

Where does all this leave me? As it happens, I bought an Adata/XPG SSD for my Ventoy "Big Drive." It's a 256 GB SX8200 Pro model, the very item that Tom's Hardware finds fault with in the afore-linked story. Good thing I only use this device for storing and occasionally loading Windows ISOs. It's new enough that I'm sure it's subject to the flaws that Tom's uncovered. If I were using it as a boot or internal SSD, I'd be irate. As it is, running it over USB 3.1 means I'd never come close to the theoretical maximum read/write rates anyway.

The Moral of the Story

Ironically, this XPG device is one of two non-Samsung NVMe devices I currently own. The other such device is a Toshiba that came pre-installed in a cheap-o purchase of a year-old Lenovo X380 Yoga laptop. I wasn’t expecting top-of-the-line components because I paid under 50% of the unit’s original MSRP. But from now on, I’m sticking with Samsung NVMe drives, so I can avoid performance dings from covert or undisclosed parts changes in the SSDs I buy and use.

Who knew this kind of thing might happen? I certainly didn’t and I’m grateful to Tom’s for calling it to the world’s attention. It will certainly guide my future NVMe SSD buying habits…


Beta Channel Gets New Feature Experience Pack 120.2212.1070.0

All righty, then. MS is changing up the way it introduces new OS features without an out-and-out feature upgrade. At least, that's how I interpret Brandon LeBlanc's November 30 post to Windows Insider. Fortunately, Mary Jo Foley at ZDNet provides additional illumination, too. Her same-day story explains that "the Feature Experience Pack is a collection of features that can be updated independently of the Windows 10 operating system." Insiders running Build 19042.662 are eligible. Alas, not all PCs will be offered the relevant KB4578968 update. Even though the Beta Channel gets new Feature Experience Pack 120.2212.1070.0, not everybody will see it.

Solution for Beta Channel Gets New Feature Experience Pack 120.2212.1070.0

I firmly believe that where there's a will, there's a way. In fact, I found a link to a CAB file that applies KB4578968. It shows up as Post #15 in a TenForums thread entitled KB4592784 Windows Feature Experience Pack 120.2212.1070.0 (20H2). I got the offer on my Surface Pro 3 Beta Channel/Release Preview Channel PC, but not on my Lenovo X380 Yoga. But by downloading the afore-linked CAB file and using DISM (details follow), I did get it installed on the latter PC.

That command is:

DISM /online /add-package /packagepath:<CABpath>

It worked on my X380 Yoga. Thus, it should also work on your qualifying test PCs. Just be sure to Shift+right-click the CAB file, and use the "Copy as Path" option from the resulting pop-up menu. Then you too can paste it into a PowerShell or Command Prompt window (admin level, of course).
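To make that concrete, here's what the finished command might look like with a hypothetical download location ("Copy as Path" supplies the surrounding quotes for you):

DISM /online /add-package /packagepath:"C:\Users\Ed\Downloads\FeatureExperiencePack.cab"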


DIY Desktops vs Prefab Still Favor DIY

I got started building PCs back in the mid-1990s when I hired a talented young man who worked in the PC parts department at Fry’s to come work with me. He showed me the ins and outs of system construction. Along the way I learned that careful parts selection could indeed deliver a faster, more capable system for less than the price of an OEM pre-fab desktop. That’s why, IMO, DIY desktops vs prefab still favor DIY, 25 years on.

Why Assert: DIY Desktops vs Prefab Still Favor DIY

As I write this item, it's Cyber Monday, and we're in the market for another desktop here at Chez Tittel. As my son's PC is getting older — i7-6700 and Z170 vintage, now 5 years old — it's time to start planning a replacement. My findings show DIY still delivers more than prefab, as I will illustrate.

Doing the DIY Thing

Given that major deals are available today, I decided to see what I could get for around $2K, either prefab or DIY. I've already got a case and plenty of HDD storage, so what I need is a PC with a capable CPU, 32 GB RAM, a 1 TB NVMe SSD for the boot/system drive, and a next-to-top-rung AMD or Nvidia graphics card. I found some motherboard/CPU bundles for about $550, memory for about $115, an Nvidia 2070 for $600, a Samsung 980 Pro 1 TB for $230, and a Seasonic 650 Platinum PSU for $130: a total of $1,625. Even if I price in the case (Antec P8 for $90) and an 8 TB drive ($165), total pricing comes in at $1,880.

Looking at Prefab Options

Looking around online at Newegg or Amazon with a $1,900 budget (I used a range of $1,850 to $2,000 for my searches), I mostly came up with 16 GB RAM configurations, 4- to 8-core CPUs, lower-end GPUs (e.g., Nvidia 1060 or 1070), 512 GB to 1 TB NVMe SSDs (at least one generation back from the Samsung 980 Pro), and 1 TB of HDD storage. That's quite a bit less oomph than the same DIY budget buys, as you'd expect. I did see some pretty amazing refurbished deals on CPUs and kit one or two generations back (mostly Intel). It still looks like refurb is the way to go if you want to buy an OEM desktop, especially if it comes straight from the OEM with a like-new warranty. (No system warranties apply to DIY systems; only component-level warranties do.)

As an example of a killer refurb deal, here's an HP Z840 workstation with two 8-core Xeon CPUs, 256 GB DDR4 RAM (!), a 1 TB SSD plus a 1 TB HDD, and a Quadro K4000 professional graphics card for $1,750. Now that's pretty tempting…

This bad boy comes with 16 cores and 256GB RAM. Zounds!

I’m still sold on DIY

When it’s all said and done, I guess I’m OCD enough that I like picking all my own parts, and putting my own systems together. I do think you get more for your money, but you also have to have the time, the patience and the knowledge to put things together and to troubleshoot and support them for yourself. I realize that puts me in a minority, but I can live with that.


Russinovich Showcases Monster Azure VMs

Trolling through Twitter yesterday, I found a tweet from Azure CTO Mark Russinovich. I'll quote the text verbatim: "Like I mentioned, Notepad really screams on the Azure 24TB Mega Godzilla Beast VM." Ultimately this thread leads to an Ignite presentation from October 2020. Therein, Russinovich showcases monster Azure VMs.

When Russinovich Showcases Monster Azure VMs, What’s the Point?

From left (older) to right (newer), the lead-in graphic shows a historical retrospective of what's counted as "monster" for memory-optimized servers over time. The itty-bitty boxes at far left started out with Intel and AMD Gen7 versions, with 512 GB and 768 GB of RAM respectively. Along came Godzilla after that, with 768 GB of RAM and more cores. Next came the Beast, with 4 TB of RAM and 64 cores. After that: Beast V2, with 224 cores and 12 TB of RAM. The current king of Azure monsters is Mega-Godzilla-Beast, with a whopping 448 cores and 24 TB of RAM. No wonder Notepad really screams. So does everything else, including the huge in-memory SAP HANA workloads for which this VM is intended.

I took Russinovich’s “really screams” Notepad remark as tongue-in-cheek when I saw it. Viewing his Ignite video proves that point in spades. What’s fascinating, though, is that some of the highest-end Azure users are already pushing Microsoft for an even bigger monster. They’re ready to tackle even bigger and more demanding workloads than Mega-Godzilla-Beast can handle.

Who Needs Mega-Monster VMs?

This rampant upscaling of resources is no mere idle fancy. Indeed, there are large companies and organizations that need huge aggregations of compute, memory, storage and networking to handle certain specialized workloads.

This also gives me insight into the ongoing and increasing allure of the cloud. Most datacenters simply couldn't put the technologies together to create such mega-monster VMs for themselves. The only place to find them is in the cloud. Further, the only way to afford them is to use them when you need them, and turn them off right away when the workload is done.

Amazing!


Busy Times for Windows 10 But…

Attentive readers will notice I haven't posted much this week. That's deliberate. I'm taking most of the week off from blogging here at EdTittel.com. Consider this post fair warning: these are busy times for Windows 10, but yours truly is pausing for a few days to recharge his batteries and spend some time with the family.

Busy Times for Windows 10 But I’m Taking a Short Break

Over the past few weeks, I've worked more hours than normal. The Wiley Dummies custom publications group and ActualTech Media's content machine have thrown a bunch of hurry-up projects my way. Frankly, I've been struggling to keep up with paying gigs. Not a bad problem to have in this time of pandemic and pandemonium. I guess I should be grateful! Good thing our US Thanksgiving holiday tomorrow will give me just the opportunity I need to voice my appreciation to our hunkered-down family crew here at Chez Tittel!

That’s not to say there hasn’t been plenty going on with Windows 10. Just this morning, I’ve seen juicy rumors about upcoming 10X features — including something fascinating called “Cloud PC” — at WinAero and WindowsLatest. We’ve also seen new releases into the Dev Channel (20262.1010, mostly just a servicing item) and Beta/Insider Preview Channels (19042.662, with oodles and scads of fixes and tweaks). Of any of the sites I follow WindowsLatest seems to be the most on top of bugs and gotchas in 20H2, and has been reporting them in some volume lately.

As for me, I’ll be back on the beat on Friday, November 27. Lord knows, I plan to have a surfeit of calories to work off from epic consumption of turkey, all the trimmings, and pumpkin pie. In the meantime for those readers who will also be on holiday tomorrow, I hope you enjoy yours as much as I plan to enjoy mine. For the rest of you working schmoes, I hope you’ll take pleasure as and when it comes your way. Best wishes to one and all, regardless.

–Ed–


20H2 Alters Alt+Tab Experience

OK, then: I get it. When you run Windows 10 20H2, the OS does something different when Edge is running. Thus, when I say "20H2 alters the Alt+Tab experience," I mean that Alt+Tab cycles through all open Edge tabs as you keep repeating that key combination. This is a little disconcerting, but something I guess I can get used to.

Exactly How 20H2 Alters Alt+Tab Experience

Prior to 20H2, if you had three applications open, striking Alt+Tab once would take you from the current application to whichever was next in the Windows sequence of open apps. Strike it again to get the third app, and again to cycle back to the start.

In 20H2, if one of the open apps is Edge, and it has multiple tabs open, things change. When you get to Edge, you'll transition from the first (or currently open) tab to the next tab in sequence. This continues until the rotation would bring you back to the first tab you visited; whatever comes up next is the next app in the Windows sequence, at which point things continue as always.
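A concrete example, with hypothetical apps: suppose Word, Notepad, and Edge (with three tabs) are open. Before 20H2, repeated Alt+Tab strikes cycled Word > Notepad > Edge > back to Word. Under 20H2, the same strikes cycle Word > Notepad > Edge tab 1 > Edge tab 2 > Edge tab 3 > back to Word.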

A Possible Alt+Tab Gotcha?

Mayank Parmar, of Windows Latest, reports that some 20H2 users may find the Alt+Tab sequence disarranged after they upgrade to this new version. He doesn't say if it applies to upgrades only, or if clean installs qualify as well. Either way, the symptoms are that the order of apps (and tabs) is inconsistent. In addition, stopping the Alt+Tab sequence on App 2 in a 1-2-3-4 sequence may drop the user into App 3, instead of App 2 as users would expect.

I haven’t been able to replicate this error on any of my 20H2 machines. But if you visit Feedback Hub and search on “Alt+Tab 20H2” you’ll see the top three resulting problem reports all talk their way around this issue. MS claims this has been addressed in Beta and Release Preview channel versions already. It’s not yet clear when that fix will make it to Windows Update, but it should be “coming soon.” Stay tuned, and I’ll let you know when that happens.


Pluton Enacts Prego CPU Philosophy

Here’s a blast from the past. In 1984, jarred spaghetti sauce maker Prego immortalized the phrase “It’s in there!” for its products. (Note: the link is to a YouTube copy of that very same TV advertisement.) But the tag line lives on, and comes with occasionally interesting applications. It helped me understand that Microsoft’s introduction of Pluton enacts Prego CPU philosophy.

What in Heck Does “Pluton Enacts Prego CPU Philosophy” Mean?

It means that functions currently associated with a separate chip called the “Trusted Platform Module” (aka TPM) move onboard the CPU die. That’s why I’m stuck on the Prego tag line “It’s in there!” It succinctly sums up what Pluton is and does.

On November 17, MS Director of Enterprise and OS Security David Weston wrote a post to the Microsoft Security blog. It explains Pluton nicely. The post is entitled "Meet the Microsoft Pluton processor — the security chip designed for the future of Windows PCs." Therein, Weston reveals the notion of a "Pluton processor" as something of a misnomer — but a useful one. Here's what he says to help explain Pluton, already "pioneered in Xbox and Azure Sphere":

Our vision for the future of Windows PCs is security at the very core, built into the CPU, where hardware and software are tightly integrated in a unified approach designed to eliminate entire vectors of attack. This revolutionary security processor design will make it significantly more difficult for attackers to hide beneath the operating system, and improve our ability to guard against physical attacks, prevent the theft of credential and encryption keys, and provide the ability to recover from software bugs.

Thus, Pluton is not really a processor per se. It’s a set of circuitry included on the die and tightly integrated into the CPU itself. This prevents attacks on communications lanes between a physically disjoint TPM chip and the CPU.

There’s a Scare Factor There

Apparently, recent research shows that the bus interface between TPM and CPU “provides the ability to share information between the main CPU and security processor…” At the same time, “…it also provides an opportunity for attackers to steal or modify information in-transit using a physical attack.” (Note: the preceding link takes readers to a Pulse Security research paper. It explains how sniffing attacks against a TPM permit BitLocker key extraction, used to read an encrypted drive.)

The Pulse Security paper describes ways to boost security to foil such an attack. But MS apparently took the work very seriously. In fact, it introduced Pluton to make communications lanes between the CPU and a security processor impervious to attack.

Can Pluton Boost Windows PC Security?

Sure it can. It will indeed make sniffing attacks like those Pulse Security describes nearly impossible. And it should usher in a new, more secure approach to computing. This applies directly to handling “credentials, user identities, encryption keys, and personal data” (Weston’s words).

The real key, however, is that MS has all the Windows CPU makers on board with Pluton. That means AMD, Intel, and Qualcomm. It will be interesting to see how long it takes for them to incorporate Pluton into their CPUs. We'll wait awhile before the first Pluton-bearing chips hit the marketplace. I'm betting that Pluton will show up in chips for both Windows Server and client OSes as well (that's not explicit in Weston's post).

My best guess is that we’re probably two generations out. For all three makers of CPUs mentioned, it’s likely that their next-gen designs are too far along to incorporate the redesign and layout rework that incorporating a security facility on the die will require. That’s why it’s more likely two (or more) generations out, IMO. Stay tuned, and I’ll keep you posted.


20H2 RDP Mystery Remains Unsolved Until …

I’ve been raving about the SFF Dell Optiplex 7080 Micro a fair amount lately. I remain convinced it’s a good purchase and will be a great machine for long-term use. That said, there is the proverbial “one thing” that lets me know for all its glories, it’s still a Windows PC. I’ve been dealing with an RDP mystery — as shown in the lead-in graphic for this story — that actually affects RDP traffic in both directions. Its 20H2 RDP mystery remains unsolved, as all my troubleshooting efforts so far have failed.

Read on, though: I did eventually figure this out, and get RDP working. It turned out to be a basic and obvious oversight on my part. Sigh.

What Do You Mean: 20H2 RDP Mystery Remains Unsolved?

Despite chasing down a long laundry list of things to check and set, I get password-related errors when trying to RDP into or out of the 7080 Micro. The lead-in graphic shows what happens when I try to RDP into the box. When I try to RDP out of the box, I get an out-and-out invalid password ("may be expired") error instead.

Obviously, something funky is up with authentication on this Win10 install, because when I try to access the device through the File Explorer network connection, I get a request for network credentials, too. Again, presenting valid credentials doesn’t work. I see a “not accessible” error message instead:

Here’s the list of what I’ve tried so far:

  1. Double-checked that Remote Access is enabled.
  2. Relaxed all relevant settings in Advanced Network Sharing for the Private, Guest/Public, and All Networks categories.
  3. Enabled all Remote Access checkboxes in Defender Firewall settings.
  4. Ran the Network Troubleshooter.
  5. Ran the Microsoft Support and Recovery Assistant.

It’s the Account, Stupid!

After noodling about with this for a couple of hours, I realized that I had defined a local account as admin. Worse yet, I had not promoted my Microsoft Account on the Optiplex 7080 Micro from ordinary user to administrator.

Because I was using my MS account credentials to attempt network login and access, I didn’t have permission to do the password lookups in LSASS needed to make the process work. Once I promoted that account to admin level, everything started working.
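For anyone in the same boat, the promotion itself takes one elevated command (the account name below is hypothetical; substitute the email address tied to your own Microsoft account). You can do the same thing in Settings > Accounts > Family & other users by changing the account type to Administrator:

net localgroup Administrators "MicrosoftAccount\someone@example.com" /add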

Sheesh! Talk about an obvious mistake. As with many problems with Windows 10, this one turns out to be entirely self-inflicted. At least, I know who to blame!

