Step 1: Install Homebrew
Go to brew.sh and install Homebrew. Homebrew is a macOS package manager: a CLI utility for downloading and installing software from the terminal.
Step 2: Install the Mac App Store CLI utility (mas) via Homebrew
From your terminal, run the following command:
brew install mas
The download and install will only take a minute or so.
Step 3: Download an old macOS via the mas CLI utility
The mas CLI will let you download anything you've purchased in the past: run the command followed by the App Store ID number. For example, the following downloads OS X 10.7:
mas install 444303913
Below is a list of Mac App Store IDs for older versions of OS X. Note: you'll need a valid Apple ID that previously "purchased" the older OS versions via the App Store; you can review your past purchases in the App Store's Purchased tab.
- OS X 10.7 Lion: 444303913
- OS X 10.8 Mountain Lion: 537386512
- OS X 10.9 Mavericks: 675248567
- OS X 10.10 Yosemite: 915041082
- OS X 10.11 El Capitan: 1018109117
- macOS 10.12 Sierra: 1127487414
- macOS 10.13 High Sierra: 1246284741
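For convenience, the list above can be wrapped in a small shell helper (a sketch: the macos_app_id function name is my own invention, not part of mas; the IDs are taken verbatim from the list):

```shell
# Hypothetical helper mapping a version nickname to its Mac App Store ID.
macos_app_id() {
  case "$1" in
    lion)          echo 444303913 ;;
    mountain-lion) echo 537386512 ;;
    mavericks)     echo 675248567 ;;
    yosemite)      echo 915041082 ;;
    el-capitan)    echo 1018109117 ;;
    sierra)        echo 1127487414 ;;
    high-sierra)   echo 1246284741 ;;
    *) echo "unknown version: $1" >&2; return 1 ;;
  esac
}

# e.g., to download High Sierra (uncomment on a Mac with mas installed):
# mas install "$(macos_app_id high-sierra)"
```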
The Bezos-owned Washington Post published a large story on Amazon fake reviews, but that's just the beginning. Years ago I bought an Anker Bluetooth keyboard. I gave it three stars despite its rock-bottom $20 price, as I couldn't imagine using it beyond the lightest use cases; the feel was abysmal (keys were squishy and wobbly). It technically "worked," but even if I'd been given one for free, I still wouldn't have liked the product. Price and value are interconnected, but there's a floor where the value of an object ceases to increase no matter how low the price goes, because my affinity for the product will not increase.
What followed was Anker harassing me to change my review. I admired that they wanted to "fix" the situation, but the product worked as advertised, so there wasn't anything to fix. Had they left it at that, this would have been the end of the story.
Anker e-mailed several times, although I deleted a few of the e-mails; I believe the count was roughly five, arriving near daily. I grew tired of it and contacted Anker, and they implied they were willing to give me more Anker products (it wasn't entirely clear what) beyond a second keyboard to reconsider my review. Anker crossed a narrow ethical line. They didn't offer me money, but they were willing to sweeten the pot under the implication that I'd consider changing my star rating. Long story short, I did not change my review, nor did I take Anker up on its offers. The keyboard was mediocre, and short of re-engineering it, there was nothing more to say. A second mediocre keyboard wouldn't have equated to one good keyboard.
I'm willing to bet Anker didn't violate any of Amazon's terms of service, yet consider the value of a real customer (me) changing a review in exchange for some bonus swag (cables? I'm speculating): they'd essentially buy a review without "buying" a review. This struck me as insidious behavior and left me with a general mistrust of "Amazon" brands, products that seem to exist entirely in the Amazon ecosphere.
I realize I'm singling out Anker, but it's also the company I've had first-hand experience with. Even in 2013, fake reviews were a problem, but this alerted me to a new form of review inflation. I've been far more skeptical of the "Amazon" brands since: those companies usually selling cheap electronics or knock-offs of more popular products with strikingly high reviews, like the popular Symphonized, which sells stylish headphones at cheap prices.
Does Symphonized pay reviewers or harass negative reviewers? I don't know and that's problematic.
Cubase - "Error Changing Permissions" When Installing: Fix (for any version of Cubase, Cubase Artist, Cubase Elements)
I recently ran into a problem trying to install the latest version of Cubase. Below are a few of the errors I encountered:
Error changing permissions in 0755 in /System/Library/Extensions/AuthenticationSupport.plugin
Error changing permissions in 0777 in /System/Library/Extensions/AuthenticationSupport.plugin/Contents.plist
Error changing permissions in 0777 in /System/Library/Extensions/AuthenticationSupportEnabler.plugin
My initial inclination, being a developer, was to use the terminal and sudo chmod the permissions, which didn't work. If the previous statement doesn't mean anything to you: chmod is a Unix utility, part of macOS but only accessible via the terminal, that changes the permissions (read/write/execute access) of files.
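For the curious, here's chmod in action on a throwaway file you own (a minimal sketch; running it against the SIP-protected paths above fails regardless, which is the point of this post):

```shell
# chmod demo on a temporary file; 0755 is the mode from the first error above
# (owner read/write/execute, group and others read/execute).
f="$(mktemp)"
chmod 0755 "$f"
ls -l "$f"    # the permissions column now reads -rwxr-xr-x
rm -f "$f"
```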
This, of course, did not work; I had encountered the same problem when attempting to update drivers for an nVidia graphics card on my Mac Pro.
macOS post-10.10 features System Integrity Protection, which prevents various system files from being modified by other software. This is a good idea, except when it creates a problem like upgrading Cubase between versions. In my case, I was upgrading from Cubase Pro 7.5 to Cubase Pro 9.5, but this could happen with Cubase 8, Cubase 8.5, Cubase 9 and the various editions of Cubase like Cubase Artist and Cubase Elements.
The process is as follows: disable System Integrity Protection, install the software, and re-enable System Integrity Protection. The steps are outlined in my nVidia post, but the same instructions are below.
Verify you have System Integrity Protection enabled. Open the terminal (located in /Applications/Utilities) and type the following command into the window; it should report that SIP is enabled.
csrutil status
Restart your Mac and hold down Command-R during startup to boot into recovery mode (alternatively, hold Option and select the recovery partition). The recovery partition will take longer to boot than normal.
You should see the macOS installer prompt screen. Ignore it: go to the Utilities menu, select Terminal, and run the following command.
csrutil disable
Reboot normally. Install the Cubase software, even if the rest of the software installed successfully.
Reboot again into recovery mode and open the terminal. Run the following to re-enable System Integrity Protection.
csrutil enable
Now you can reboot normally and start using your software!
In the past week or so, I've been hit with two separate requests to fix broken links on old blog posts, each four years old or older. The first is a "nice try" from a rather crappy tech blog; Comparitech seems to be a form spammer. Comically, the example I found targets the FreeBSD Pipermail mailing list, an archived post from 2002 about VNC mail configuration. The bot suggests linking to an article explaining the difference between VNC and a VPN.
Ellen Fisher <firstname.lastname@example.org>
3:50 AM (9 hours ago)
I found a link that isn’t working on one of your pages and thought you’d want to know.
I landed here - http://blog.greggant.com/posts/2013/10/17/53-mac-only-design-development-utilities-apps.html, and noticed you have a link to the Webgraph Facebook Blocker tool (http://webgraph.com/resources/facebookblocker/) which seems to have been discontinued.
We have a guide to help people stop Facebook tracking them across the web - SPAM URL removed
If you are updating your page, perhaps you could point people to our guide instead?
I hope this helps!
Yeah, I'm not going to do that. The guide was decidedly so-so, and a bit out of date to boot.
The second instance is interesting for its persistence: three separate e-mails, spaced out. The link in question was to a website offering a pirated Flash version of Plants vs. Zombies. As I do not have Flash installed, I couldn't comment on its quality, but it was likely loaded with advertisements.
Jessica Bridges <email@example.com>
Are you able to please update something on your website?
You were linking to the Plants vs Zombies game on this page of your website - http://blog.greggant.com/posts/page8/
The link was going to this game - http://www.popcap.com/games/pvz , but I guess since popcap sold PVZ to EA they took the game away....
Here is a secure working version I found on Google - SPAM URL REMOVED
Hope it helps! Classic game =)
Digital Artist & Illustrator @ Jess Creative
The spammer tries to engage again.
Jessica Bridges <firstname.lastname@example.org>
I emailed a few days ago about the Plants vs Zombies broken link on your site, wondering if you had the chance to update it yet?
Don't mean to pester you, just my OCD talking =)
Jessica Bridges <email@example.com>
Last email I promise =) Just wondering if you've received my emails below about the broken link? I don't mean to be a nag, I'm just kind of a nerd for these things =)
My guess is these are bots pre-programmed to search the bowels of Google for broken links as an angle to target small websites: "correcting" URLs is a way to gain standing via backlinks and thus page weight in Google. The Jessica bot is interesting for the follow-ups; my theory is it would have kept spamming me even if I had changed the link.
Let's just say I'm not a fan of Apple's decision to remove the headphone jack. Rather than rehash my entire rant, the long and short of it: Apple removed the headphone jack to sell its W1 headphones, knowing the shortcomings of Bluetooth. The W1 headphones provide a better user experience than Bluetooth alone can, and Apple has yet to license the W1 technology outside its own Beats headphones. While iPhone audio isn't "closed," as any Bluetooth headphones will work off the shelf, it has given Apple/Beats headphones an advantage. Any argument for removing the headphone jack has to contend with the reality that Apple is placing a squeeze on third-party headphones, and the headphone jack represented a port that Apple had no way to subjugate. Pundits cheered as the noose tightened.
Pictured: The bulky Fuze case was the first case that offered a headphone jack.
Since owning the iPhone 7, I've owned several failed products, the most significant letdown being the Fuze case, a half-baked product that provided a janky non-MFi headphone jack and a questionable battery case. It was bulky. Worse, it just didn't work well: it didn't support headphone controls or headphones with microphones. The battery case had to be powered up and down, and if the case was out of battery, the headphone port would fail to work. The case also occasionally failed to be recognized. The company turned out to be a bit of a scam too, closing up shop only to reappear as powerpluscases.com, selling the same crappy case.
My second try was a Veniveta iPhone 7 case, which was simply a Bluetooth audio adapter stuck to a case. Ironically, this half-baked case was far more viable than the Fuze, despite its shortcomings. Again, headphone controls didn't work, the case required independent charging, and its Bluetooth was glitchy, often failing to connect the first time I fired it up. I was able to put up with it: it had the same problems as the Fuze without the bulk, and its crappy performance was at least a bit more reliable.
Pictured: The veniveta lasted about a year before failing to hold a charge.
Looming forever has been the Incipio OX, a case made by a reputable case maker. Every few months since its announcement, I'd e-mail Incipio about the status. Finally, when I went to check on the mythical case, I found it was shipping, and I ordered. It's somewhat pricey at $69.99, but I used a 15%-off coupon found with a little Google-fu, bringing it down to $59.50. The order shipped the day I placed it (with free shipping) and took only three days to arrive via USPS.
The OX is low profile, akin to the sort of case iPhone users have been accustomed to since the phone's inception: a rubberized, molded plastic case that fits snugly to the iPhone. Unlike the Fuze or the Veniveta, it functions as a protective case, with razor-thin margins that keep the camera lens from protruding beyond the case and a scant millimeter of lip around the screen, protecting the screen when it rests on surfaces. It's soft to the touch and reminds me of the official Apple iPhone cases. It will protect your phone and feels as impact-resistant as any high-quality low-profile case. It's stylish in the way any case is; nothing beats the look of an uncased iPhone, but if you're wrapping yours up, you won't be visually offended by the Incipio.
Snapping on the case is pretty simple and requires little effort: line up the Lightning port and plug it in. I was a bit unnerved when I received "Unsupported Device" messages from the case, but I'll get to that in a minute. The volume and power buttons are covered but remain easily accessible and easy to press. Lastly, the case adds a bit of a chin to the iPhone, with two ported sections to project the internal speaker. It's novel, as it makes the iPhone speaker directional and more effective. These are the little things that separate Incipio from Indiegogo would-be case makers.
After plugging the case in and receiving the "device not supported" message, I was worried. I plugged in my headphones, pressed the play/pause button and... it worked. I then proceeded to plug my phone into my car charger and into my deck. My iPhone was charging AND playing music at the same time. In subsequent case fittings I haven't seen the message, so I'll chalk it up to user error.
I tested it with multiple sets of headphones (Massdrop x NuForce, Symphonized NRG, Klipsch X11i, Beyerdynamic DT-990 and DT-770), and every last one worked. Pulling out the headphone jack paused the audio, as expected. The only minor hiccup: I didn't seem to get discrete volume levels, with the jack detecting the difference between headphones with inline controls and standard headphones, something iPhones with headphone jacks were able to do.
The audio quality was the same as the Apple dongle cables that have haunted me for the past year and a half, and much better than the Fuze, which sounded soft and distant, or the sometimes-gravelly cheap Bluetooth of the Veniveta.
Pictured: iPhone 7 with OX case vs iPhone 6 with Apple case. The OX is slightly thinner.
It took too long to hit the market, but THIS IS THE CASE FOR ANYONE WHO WANTS A HEADPHONE JACK ON THEIR IPHONE. It works, and it works well. It's light, well made, and, oh, it works. After being burned twice, I've found new harmony in my life: I'm listening to my earbuds and charging my phone as I type this. It's everything I've missed from the iPhone 6. I just wish I could have had this case sooner. I haven't had a chance to test it with the iPhone 8, but seeing as the iPhone 8 differs only by 0.2 mm in thickness, my gut says it'll fit.
Right now, as far as I know, it only comes in the iPhone 7/8 size and not the Plus. The only other game in town is yet another Indiegogo campaign, this time by Encased for a product called the "AudioMod," another bulky battery case with a headphone jack, advertising versions for the iPhone X and Plus variants. It looks more promising than the faceless brand behind the Fuze. Personally, the Incipio is exactly what I want, as I'm not fond of battery cases, but at least iPhone X and Plus owners can join the party. Here's hoping Incipio continues the OX line.
It's been on my to-do list, but as out-of-sight, out-of-mind problems go, I hadn't gotten around to it until now. There may be a day or so of a "self-signed" security error, but after that this blog should be 100% HTTPS friendly.
"Where I think this whole saga gets very frustrating for a lot of current and potential Mac Pro customers is that Apple is describing a product — a powerful, professional-grade, modular desktop computer — that already exists: it’s the tower-style “cheese grater” Mac Pro. While Apple is working away to reinvent one of the most critical components of a professional user’s workflow, those users are stuck with product choices that may not quite fit." - Nick Heer, Pixel Envy.
This should be embossed at Apple's Professional Workflows HQ. To paraphrase Paul Haddad: just throw some Xeons in a box. This should be the easiest product release in Apple's entire lineup. Pros just want a box that can house multiple storage devices, has PCIe slots and the latest I/O (even Thunderbolt is entirely optional when you have PCIe) and, lastly, is user serviceable. That's really it. They could literally reuse the case from the Power Macintosh 9600 and we wouldn't care.
Apple envisioned the 2013 Mac Pro as a machine that could be carted onto the set of a Hollywood-style shoot to edit dailies on the spot with Final Cut Pro X, but conceptualized it in a vacuum. Apple takes the approach that the customer doesn't know what they want; that's true in the consumer market, but it's a massive mistake when you're dealing with professionals. They know exactly what they want.
If you want evidence of the demand for such a mythical device, search "2012 12-core Mac Pro" on eBay and try to name another computer that holds value like it: many cost more than the current 5K iMac, new from Apple.
Years ago, I posted a guide on how to install a GeForce 760 or 770 into a 2008 Mac Pro, with a fair amount of benchmarks to boot. That card lasted me well over three years and made the jump to a 2010 Mac Pro, but I finally pulled the trigger on a 1060. You can install a 10-series card into a 2008 Mac Pro as well, but this guide focuses specifically on the 2010/2012 Mac Pros. The main differences between the two are the positions of the PCIe power ports and the 2008's lack of the annoying PCIe bar hanger latch. Upgrading took me only a few short minutes; the longest part of the process was plugging/unplugging all my connected devices. Hardly any special skills or knowledge are needed.
Before you get started, there are a few things you should be aware of:
- Both AMD and nVidia make EFI-compatible graphics cards that will work on OS X. nVidia cards (GeForce 700 through 1000 series) only require installing the web drivers, whereas on the AMD side the Sapphire PULSE Radeon RX 580 8GB is (so far) the only RX 580 that works without any hacking/flashing.
- The nVidia drivers currently require 10.12 Sierra or above to use the 1000 series cards.
- Neither the nVidia cards nor the AMD RX 580 will show you the EFI boot screen (the screen with the Apple logo that you see when holding down the Option key). If this is important, I highly recommend keeping an original (or flashed) card around. I personally use the ATI Radeon HD 2600 XT (so old that it's not AMD) that shipped with my 2008 Mac Pro, since it's modified to be fanless, but any card will do, flashed or factory, as long as it can display the Apple logo on boot. You can operate the computer without a card capable of displaying the EFI boot screen; however, you'll have to manage booting using Startup Disk in OS X and the Boot Camp tools in Windows to switch boot drives, and you will not see any picture until the login screen.
- The RX 580 and GTX 1060 perform fairly evenly, but as of this writing the 1060 is cheaper (since any model will suffice), requires less power, and some models are significantly quieter.
- Modern graphics cards require additional power cabling, and they rarely ship with the power cables. You'll need to purchase the cables separately; also, Mac Pros require mini-PCIe to PCIe power cables.
- Modern GPUs are still quite performant in Mac Pros. A 2010 Mac Pro with a GeForce 1080 eats a 5K iMac alive in GPU tests (unsurprisingly).
- Not every port on a GPU may work with the nVidia drivers, depending on the card's configuration. In the case of my GeForce GTX 760, all ports worked sans one of the DVI ports. As a general rule, count on most but not all ports working, and do diligent research; the best places to check are the MacRumors and TonyMacX86 forums.
If you're upgrading from a stock card, you may be unaware that the PCIe bus doesn't deliver enough power on its own, so additional PCIe power cables are required. The Mac Pros include two ports for PCIe power but use special low-profile cabling often referred to as "mini PCIe."
The GeForce 1060 / 1070 / 1080 require external power. The 1060 requires an 8-pin power cable, while the Mac Pro's ports are 6-pin, so you'll need a 6-to-8-pin power adapter. I ordered two of the mini-PCIe to PCIe power cables (disregard the G5 mislabeling) and a 6-to-8-pin PCIe power adapter, which are much more easily found.
This may differ between card manufacturers, but the following is true for the base models.
- GTX 1060: 2x mini PCIe to PCIe cables, 1x PCIe 6-to-8-pin adapter
- GTX 1070: 2x mini PCIe to PCIe cables, 1x PCIe 6-to-8-pin adapter
- GTX 1080: 2x mini PCIe to PCIe cables, 2x PCIe 6-to-8-pin adapters
The MSI GTX 1060 is massive, roughly 11 in x 5.5 in x 1.5 in thanks to the oversized cooler.
Any off-the-shelf GeForce GTX 1060, 1070, or 1080 will do. Personally, I picked up the MSI GTX 1060 Gaming X 6 GB, regarded as one of the least noisy cards on the market. With bitty coins wrecking pricing, I just wasn't willing to pay for a 1070; I hope all cryptocurrency fails so we can go back to normal pricing, but I digress. I paid $355, which isn't great, but many makes of GTX 1060 are going for more.
Plug in your power cables first! The GeForce 1060 is big; it dwarfs my 760. Fortunately, the Mac Pro 2010 / 2012 ports are much easier to access than in a 2008 Mac Pro.
The low profile mini PCIe power cables are located in the bottom back of the PCIe chamber.
Do the usual: remove the slot thumb screws, remove/move the old GPU, etc. The 2010/2012 Mac Pros have a very annoying PCIe rail hanger, which requires pressing forcefully away from the PCIe card to unseat and reseat cards. Use the bottom-most slot, as the card is dual-height.
If you're looking for more information on how to install a PCIe card in a Mac Pro, everymac.com has plenty of information including videos.
I haven't spent much time with the card, but I did fire up Tomb Raider (2013) on OS X via Steam. At 2560 x 1440 with all settings maxed (16x anisotropic filtering, etc.), I managed an average frame rate of 57.6 FPS on a 12-core 2.93 GHz 2010 Mac Pro with 32 GB of RAM.
It's no secret that there's always been a gaming performance gap; macOS sadly scores quite badly compared to its Windows counterpart, so when considering the gains it's only fair to compare Mac to Mac or Windows to Windows, not Mac to Windows. Rather than benchmarking Windows, which isn't my daily driver, I'm more interested in how the GPU affects macOS. Below are my Unigine 4 benchmarks versus the runs against my 2008 Mac Pro. Despite the low marks compared to running Unigine in Windows, the 2010 Mac Pro is twice as fast by these benchmarks as my previous setup of a 2008 Mac Pro running a GeForce 760. One of the more fascinating things I learned when trying my hand at a Hackintosh was that the 3rd-generation 3770K i7, despite having a faster bus/CPU, wasn't quite enough to best the over-engineered Mac Pro; it merely matched it. If/when I have more time, I may swap the GPUs to see if the scores are as GPU-dependent as they seem.
OpenGL 2560 x 1440 8xAA FullScreen Quality:Ultra Tessellation: Extreme
Mac Pro 2010 (Xeon X5670 2x 2.93Ghz) + GeForce GTX 1060 + 32 GB RAM + Samsung 840 750 GB SSD
Min FPS: 7.4
Max FPS: 72.1
Mac Pro 2008 (Xeon E5462 2x 2.8 Ghz) + GeForce GTX 760 + 14 GB RAM + Samsung 840 750 GB SSD
Min FPS: 5.8
Max FPS: 37.4
Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 760 + 16 GB RAM + Samsung 840 750 GB SSD
Min FPS: 6.9
Max FPS: 37.3
Hackintosh (i7 3770k 3.5 GHz) + GeForce GTX 770 + 16 GB RAM + Samsung 840 750 GB SSD
Min FPS: 7.6
Max FPS: 47.5
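The "twice as fast" claim is easy to sanity-check from the max FPS figures above with a quick awk one-liner:

```shell
# Ratio of max FPS: 2010 Mac Pro + GTX 1060 (72.1) vs 2008 Mac Pro + GTX 760 (37.4).
awk 'BEGIN { printf "speedup: %.2fx\n", 72.1 / 37.4 }'
# prints: speedup: 1.93x
```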
Mac Pro 2010 GeForce 1060 vs eGPU setups
I used benchmarks provided by a thread on eGPU.io; credit goes to the forum posters for the comparisons. There aren't any perfect comparisons, so here's a run of the GTX 1060 in my Mac Pro 2010 versus a Thunderbolt 3 Mac running the considerably better 1070 and a 2011 iMac running a 1060. Depending on your perspective, either the eGPUs do quite well or the Mac Pro 2010 is still fairly viable; the big difference between eGPU and internal shows in the minimum FPS.
OpenGL 1920 x 1080 8xAA FullScreen Quality:Ultra Tessellation: Extreme
Mac Pro 2010 (Xeon X5670 2x 2.93Ghz) + GeForce GTX 1060 + 32 GB RAM + Samsung 840 750 GB SSD
Min FPS: 19.3
Max FPS: 106.5
iMac 2011 27 inch (3.4 GHz) + GTX 1060 6GB
Min FPS: 8.4
Max FPS: 96.9
MacBook Pro late 2016 13 inch (2.9 GHz) + MSI GTX 1070 6GB Aero OC
Min FPS: 9.8
Max FPS: 138.8
macOS vs Windows
As previously mentioned, this shouldn't come as any sort of surprise, but Windows 10 gaming is still quite a bit ahead of Apple, although Metal shows promise. As of right now, DX11 is king in performance, regardless of your opinion of it. Windows performs a full 10 FPS faster, about 24%, in the same benchmark with the same settings.
OpenGL 1920 x 1080 8xAA FullScreen Quality:Ultra Tessellation: Extreme
macOS, OpenGL
Min FPS: 19.3
Max FPS: 106.5
Windows 10, 64 bit, Direct 3D 11
Min FPS: 21.7
Max FPS: 135.3
I plan to update the benchmarks in time. I may bring in the GeForce 760 for reference when I have more time, and possibly test in a 2008 Mac Pro in the future.
For the first boot, it's a good idea to keep an EFI card around, as you may have to enable the web drivers. Also, I recently encountered the "Mac nVidia web drivers fail to update or cannot remove kext files" error when updating my OS; you'll want to follow the instructions I posted to uninstall the drivers if this happens to you.
Upgrading GPUs isn't something I'd normally wax philosophical about, but we're past the golden era of OS X, and the Mac Pro is a relic.
Ever since nVidia shipped its web drivers, gone are the sketchy days of flashing a 6970 with a ROM creator. Installing off-the-shelf GPUs has gone from tribal knowledge to common knowledge among Mac Pro users since I wrote my "how to" guide for the 760; ironically, it wasn't until Apple killed upgradability that off-the-shelf GPUs could be bought without the infamous Apple tax. I debated even calling this article a "how to." The downside: despite the EFI-compatible ROMs preloaded on the GeForce 700-series and later cards, they sadly aren't EFI boot screen compatible on OS X. The only game in town is macvidcards.com, which by all accounts on MacRumors is a legit source, but I find the idea of hoarding an EFI hack a little irksome. It's hard to complain too much, as nVidia has quietly kept the Mac Pro and Hackintosh communities happy, self included. There's no specialized knowledge needed to upgrade your GPU, nor the usual risk of a bad firmware flash. The only caveat is you'll want to keep an EFI card around for major OS updates.
Upgrading the GPU is probably the second-best thing, after an SSD, to make an old Mac Pro feel young if you want to run 4K, use any sort of motion graphics software, play games, etc. It's hard not to recommend the upgrade, as there's a strong case to be made for removable GPUs: a Mac Pro armed with a higher-end GPU will best even the mighty iMac Pro handily in GPU-related benchmarks.
eGPUs are viable but not as performant; there's simply no topping a PCIe card slot, although we're probably coming to the end of the Mac Pro era if/when Thunderbolt gets an update. Thunderbolt 3 is fast but still has a lot of room for improvement: its 40 gigabits (5 GB/s) is approximately the speed of a 4x PCIe 3.0 slot. If/when Thunderbolt gets an upgrade (Thunderbolt 4?), bumping it up two-fold would bring it to roughly 8x PCIe 3.0, or shy of 4x PCIe 4.0. 8x PCIe currently offers roughly 95-99% of full performance for gaming, even with a GeForce GTX 1080. That said, PCIe 4.0 is coming out very soon, and PCIe 5.0 may be only a year and change out, boosting 16x PCIe to a truly mind-boggling 63 GB/s (504 gigabits per second). Thunderbolt won't be catching up to PCIe any time soon, but for practical purposes with consumer GPUs, it could be enough.
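Those bandwidth comparisons are easy to sanity-check with a little shell arithmetic (decimal units, ignoring protocol overhead; the ~0.985 GB/s-per-lane effective rate for PCIe 3.0 is my assumption here):

```shell
# Back-of-the-envelope bandwidth math for the comparison above.
awk 'BEGIN {
  printf "Thunderbolt 3: %.1f GB/s\n", 40 / 8           # 40 Gbit/s over 8 bits/byte
  printf "PCIe 3.0 x4:   %.2f GB/s\n", 0.985 * 4        # ~4 GB/s, comparable to TB3
  printf "PCIe 5.0 x16:  %.0f GB/s\n", 0.985 * 4 * 16   # PCIe 5.0 is 4x 3.0 per lane
}'
```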
Also adding to the end of the cheese-grater era is the ever-looming next Mac Pro. The word "modular" has been tossed around quite a bit recently to describe the next iteration, and the Mac Pro flames have been stoked yet again by a very curious mention in Bloomberg's rumor-filled article "Apple Is Said to Plan Move From Intel to Own Mac Chips." It's highly unlikely Apple has anything in the pipeline that's even near the iMac's i9 configurations, but the next Mac Pro will likely sport the same Bridge2,1 ARM A10 coprocessor found in the iMac Pro. Also, the new Mac Pro is at least out to 2019 and will be "shaped by workflows."
The Bridge chipsets allow for some truly unexciting features, like "Hey Siri" being always on even when the computer is shut down, and for managing graphical keyboards like the one found in the MacBook Pros.
My gut feeling is that if the iMac Pro is any sort of indicator, the next Mac Pro will be absurdly expensive, and my guess is it'll sport less upgradability than the 2006-2012 "cheese grater" Mac Pros but more than the abysmal 2013 "trash can" Mac Pro. The floating rumors around ARM CPUs seem a step away from modularity but a step closer to iOS-ifying the Mac: annual upgrades, stopping the Hackintosh community, and locking users out of OS upgrades after five years. I am not optimistic about the future of the Mac Pro or the Macintosh.
The Mac Pro has been a bit of an outlier. I used a 2008 Mac Pro for 10 years. When I bought it, I was still on a 3-year upgrade cycle, having gone G3 -> G4 -> G5. I used my 2008 Mac Pro longer than all three of those computers combined, and only recently did I replace it with a 2010 Mac Pro. Engineering a computer that can be used viably for 10 years means a significant reduction in computer sales for Apple, and I worry they understand that too well. All for the cash, man...
For now, Mac users have only three choices: eGPUs, old Mac Pros, and the elusive Hackintosh. Any path will get you serious gains. My guess is the 1000 series is likely the last stop for most cheese-grater users, as we're at a crossroads: Thunderbolt is almost fast enough for GPUs (and PCIe enclosures are becoming more popular), and Apple may yet give us a modular computer.
Some minor proofing and added in a lot more benchmarks. Kids love benchmarks.
Final Thoughts ended up long-winded.
With nVidia graphics cards in a Mac Pro (for those of us who refuse to let go) or their PCIe Thunderbolt brethren, you're probably used to updating the drivers with every OS X version by now. However, sometimes updating the nVidia drivers yields an installation failure after initially appearing to install correctly, ending with a generic "contact manufacturer" error. This error isn't telling the full story: OS X post-10.10 has a feature called System Integrity Protection, which protects certain system files from being modified by even the root user, stopping malicious installers/rootkits from tampering with macOS. This can also adversely affect no-longer-used files, such as items placed in the "Incompatible Items" folder; when the user tries to delete them, they'll receive a "can't be modified or deleted because it's required by macOS" error message.
Before proceeding, it's very important to understand that you should only do this to run installers from a valid source, such as drivers downloaded directly from nVidia (which performs its own certificate check), or to remove offending drivers or files. After performing the necessary changes, re-enable System Integrity Protection.
First, to make sure you have System Integrity Protection enabled, go to the terminal and run
csrutil status
This should return a status of enabled.
Restart your Mac, and hold down Command-R keys during startup. This should boot your computer into recovery mode (alternately, you may be able to hold option and select the recovery partition). This may take a few minutes to boot.
Ignore the installer prompt; go to the Utilities menu, select Terminal, and run:
csrutil disable
Reboot. Perform the necessary changes, then boot back into recovery mode as before and run:
csrutil enable
Reboot. You can now run csrutil status again to confirm that System Integrity Protection is re-enabled.
Long-time friend James Treneman published his first game on Steam, Kite. I saw it in its earliest stages; it's a labor of love, a one-man operation, and it's now a full game. It's damn impressive that one person could make a game by himself, and more impressive that it's a full-fledged game harkening back to Smash TV/Zombies Ate My Neighbors, mixing in RPG elements, missions, and pixel art.
I did something today for the first time in a decade: I ordered a Mac desktop. I've been using my Mac Pro 2008 for a decade, a feat I never imagined would be feasible.
What am I replacing my 2008 Mac Pro with? After evaluating the options, the iMac Pro was just too rich for my blood given its likely shelf life, and the regular iMac just isn't as beefy as I'd like, especially in the GPU department. I ended up ordering a used 2010 Westmere Mac Pro, 12-core 2.93 GHz. I don't expect to get the same use out of it as my 2008; just a year or two, until we see whether Apple replaces the Mac Pro with a modular computer.
By the numbers, the 8-year-old Mac Pro 2010 I'll be receiving bests my 2015 2.5 GHz MacBook Retina in most Geekbench scores. It even bests the current round of iMacs (excluding the iMac Pro) in CPU performance. It'll be performant enough to serve as a media PC/server should I choose to replace it in the upcoming years. It still strikes me as absurd that 12-core Mac Pros still hover around the $900-1800 mark depending on configuration. If that doesn't show demand, I don't know what does. Apple needs a modular computer for a certain class of users.
I've spent a fair amount of time blogging about the Mac Pro. The Mac Pros of 2006-2012 remain the high-water mark of desktops, the most elegantly designed towers, a refined mix of modularity, ease of access, and raw power. Opening up the guts to see the (nearly) wire-free world, with an (almost) screwdriver-free experience, made cracking open a Mac Pro easier than even the famed "folding door" design of the G3/G4 tower era. It's this painstaking beauty that makes one appreciate the industrial design chops of Apple at its best: features that are only touched a few times over the life of the computer are designed to be pleasant, if not downright beautiful. The rare PC case today has a locking door that doesn't require screws. Rarer still are cases that have sleds for storage. Then there are things that remain unique to the Mac Pro: to this day, PC cases still do not have handles or raised feet, chambered cooling, trays for CPU/RAM, or cable-free designs. That's not even touching the aesthetics of the garish and utterly unsightly PC cases that still plague (if not make up the entirety of) the market.
The end of the Mac Pro wasn't a surprise. You could see the tide receding with the rather modest and unimpressive 2012 update, which failed to bring USB 3.0, SATA 3, and Thunderbolt to the desktop arena. The last embers of hope for the mythical creative professional could be seen smoldering with the release of Final Cut Pro X. Laptops have crept into the lives of even the most hell-or-high-water desktop users as they caught up to aging desktops in performance. Perhaps that's what killed the Mac Pro: engineering a computer that could last a decade.
Love it or hate it, Bootstrap has been a mainstay of front-end development since 2011. I've watched it grow and now, dare I say, flounder.
Rather than recount the ups and downs of each generation: Bootstrap 3 was wonderful for its simple flexibility. Most of the time, I whittled Bootstrap down to the bare minimum, often using only its grid (modified with my own breakpoints) and in-name-only classes like
.btn, as they're part of the Bootstrap lexicon. On any project, I could rely on Bootstrap-like markup and classes even if the project was largely not Bootstrap. Bootstrap 3's Sass logic was simple and easy, but Bootstrap 4's is silly.
- Bootstrap 4 now uses Sass includes for breakpoints. Why? I cannot fathom a realistic reason. It's counter-intuitive; everything is include hell.
- Most of the generative sass logic has been abstracted into mixin hell. It's starting to resemble the clusterfuck that is Foundation.
- The cross-dependency of Sass isn't predictable. Example: If you comment out forms, it will break nav functionality. There's a lot of senseless overhead.
- The light and dark themes are written into the code in such a way that they can't easily be abstracted out.
- While small, some of the icons are inlined SVG images, which means removing them if custom icons are used; more senseless payload.
Bootstrap 3 was the right mix of complexity to return on investment, but Bootstrap 4? I'm starting to think otherwise. So far, there's not enough in Bootstrap 4 that's compelling. The conversion to REM units is nice, as is opt-in Flexbox. Dropping IE8 is a good move. Glyphicons needed to go for accessibility. The overall CSS is smaller; I like that. The hackability, though? Less so.
Often as a developer, you want to simulate the experience of limited bandwidth for people with slower internet connections. Chrome and Firefox have this built into the browser, but it only affects the browser, doesn't provide robust parameters for latency, and doesn't affect the rest of the system. Safari doesn't have this, in part because Apple provides the Network Link Conditioner as a separate utility.
To install the Network Link Conditioner, you'll need the following:
- Apple Developer account (no paid licensing is required)
- Xcode installed
Next, go to downloads for Apple Developers and sign in. The Network Link Conditioner utility is packaged in with other utilities. Search for Additional Tools or use one of the links below.
- OS X 10.10 users should download Hardware IO Tools for Xcode 7.3
- OS X 10.11 users should download Additional Tools for Xcode 8.2
- OS X 10.12 users should download Additional Tools for Xcode 9
Open up the DMG and install Network Link Conditioner.prefPane by double-clicking it. (Note: in Additional Tools, it'll likely be in the hardware folder)
Using Network Link Conditioner
Open System Preferences on your computer. Click on Network Link Conditioner, toggle it on with the On/Off switch, and use the drop-down to choose presets. You can create your own with Manage Profiles.
Congrats, now you can enjoy slow internet.
First off, I highly recommend reading CSS-Tricks' Build a Style Guide Straight from Sass; it's a game changer for automatic style guide generation. That said, I assume if you're on this page you're already a convert.
I'm going to assume the following:
- node-kss is installed in the same directory as your gulpfile
- node-kss has been set up and is generating a style guide.
- you have at least very rudimentary understanding of gulp
If either of the first two is untrue, please go to the CSS-Tricks link, as it's a wonderful guide and will get you to a working spot. Node-KSS has a gulp repository, but it's woefully out of date; I recommend not using it. Fortunately, chaining it is pretty easy. First, we need to install gulp-shell in our gulp project.
Next, we need to require gulp-shell in our gulpfile. This can vary based on your setup; it may be var or const depending on whether you're running ES6, or it may be part of a larger declaration:
Next, we create a task in our gulpfile to execute the command that runs node-kss. (Note: you can run variations of this command if your configuration is different; kss is not required to be installed in the same place as gulp.)
Lastly, we need to reference this task in another task. Below is an example of how I'm using it: I created a watch task called "styleguide", a slightly modified version of my default task. Your task will differ from mine.
Note that I applied
gulp.run('kss'); after my Sass task has run; this generates the style guide. Since the style guide generates new HTML on every save, my
gulp.watch(appDefaults.watchHTML).on('change', browserSync.reload); is triggered because of my project's directory structure. This is why I created a separate task named "styleguide": I do not always need my kss task to run, and I don't want to interfere with live CSS injection via browserSync. Your needs will vary.
Every now and again, I remember I have a GitHub account and throw something simple up there. I made a Grunt Boilerplate years ago and finally got around to making one for Gulp. There are a few features I still need to stick in, but I like to have a starting point rather than re-inventing my tasks every project.
Features all the greatest hits:
- Sass processing
- CSS Browser auto-prefixing
- CSS minification
- JS Uglify (minification)
- BrowserSync (Inject CSS changes + follow, reload on JS change)
This is mostly for my own benefit, but if anyone finds it useful, I'm glad. You can nab it here: Gulp-Sass-JS-BrowserSync-Boilerplate.
So you're here because bash is outputting some big mess when you tried to install gulp-sass or node-sass via npm. You've probably updated Node and npm, switched versions in NVM or Homebrew, and are beating your head against the wall over why node-sass won't install. The issue is likely not the Node or npm version, but the package.json.
Go to package.json and look at the versions. Most likely the version is locked to a very old release of node-sass or gulp-sass in your project (or the project you're using). Switch its version to something recent (as of this writing,
"gulp-sass": "^3.0.0" or
"node-sass": "^4.7.2"). Congrats, it'll now install!
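For example, the relevant block of package.json might look like this after the change (the gulp entry and its version are illustrative; only the node-sass/gulp-sass lines matter here):

```json
{
  "devDependencies": {
    "gulp": "^3.9.1",
    "gulp-sass": "^3.0.0",
    "node-sass": "^4.7.2"
  }
}
```

Run npm install again after editing, and the native binding should build against your current Node version.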
All major browsers have built-in login managers that save and automatically fill in username and password data to make the login experience more seamless. The set of heuristics used to determine which login forms will be autofilled varies by browser, but the basic requirement is that a username and password field be available.
Login form autofilling in general doesn’t require user interaction; all of the major browsers will autofill the username (often an email address) immediately, regardless of the visibility of the form. Chrome doesn’t autofill the password field until the user clicks or touches anywhere on the page. Other browsers we tested  don’t require user interaction to autofill password fields.
Ironically, before the holidays I had to deal with this from the opposite end, as Safari's auto-form filling was filling out hidden fields.
Consider the following:
- Safari's autofill can fill out more than just username/password.
- Safari's autofill does not give you the ability to view the stored information in its local database other than site entries.
- Safari's autofill will fill out hidden fields.
- Safari's autofill does not trigger a DOM event when filling fields, even those hidden via visibility: hidden or display: none. Safari does allow you to query for input:-webkit-autofill, but testing for this means super-hacky setTimeout and setInterval workarounds.
- Safari does (mostly) respect the HTML5 autocomplete convention, but will ignore autocomplete="off" on username or password fields.
This leads to a bizarre world where Safari is egregiously handing out info that can't be vetted.
Pictured: Safari's autofill manager for non-username/password data ("other") doesn't allow you to see what information it's autofilling or edit the values. I found some surprising entries in my own Safari autofill manager.
I had a problem where a donation form was failing our API validation: Safari's autofill was completing hidden form elements without triggering changes, creating scenarios we hadn't previously considered. It took error logging to figure out that Safari was the culprit, and a heavy dose of intuition to figure out that it was autofill.
The solution was to add autocomplete and disabled attributes, but it led me to wonder about the potential abuses of autofill. Apparently, I wasn't the only one.
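Since Safari fires no DOM event when it autofills a field, detection comes down to polling. Below is a sketch of that idea; the field read is abstracted as a callback so the logic can run outside a browser, and all names here are my own, not from any library.

```javascript
// Sketch: poll for a value appearing in a field with no input/change event,
// which in Safari usually means autofill. In a page, readValue would read
// input.value (or test the input:-webkit-autofill pseudo-class).
function watchForAutofill(readValue, onFilled, { interval = 100, maxTries = 20 } = {}) {
  let tries = 0;
  const timer = setInterval(() => {
    const value = readValue();
    if (value) {
      // Field gained a value without a user event: likely autofill.
      clearInterval(timer);
      onFilled(value);
    } else if (++tries >= maxTries) {
      clearInterval(timer); // Give up; the field was never filled.
    }
  }, interval);
}

// Browser usage (hypothetical element id):
// const field = document.getElementById('email');
// watchForAutofill(() => field.value, (v) => revalidateForm(v));
```

It's ugly, which is rather the point: there is no clean signal, only the setTimeout/setInterval hacks mentioned above.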
For years I've leaned on ImageOptim as my go-to for image optimization. I tend to be a little obsessive, using modern formats (WebP, JPEG 2000) and testing out avant-garde projects like Guetzli by Google. I recently decided to finally try out Squash by Realmac Software.
Over the years, codecs have improved remarkably, especially in the realm of video. For example: H.261 (1984-1988) -> MPEG-1 (1988-1991) -> MPEG-2 aka H.262 (1996-2015) -> MPEG-4 aka H.264 (1999-current) -> High Efficiency Video Coding (HEVC) aka H.265 (2015-current). Each iteration has the ultimate goal of improving video quality at lower bit rates. This doesn't even cover the other formats (VP8, VP9, Ogg Theora, DivX, 3ivx, Sorenson, RealMedia, and the many others of the past 30 years), all of which have had varying degrees of mainstream success. Audio has had a similar trajectory: LMA 4:1, MPEG, MP2, MP3, AAC, Ogg, AC3, and DTS, to name a few.
However, static images haven't had the same wide range of codecs (most formats are lossless proprietary files used by various image editors) and have been almost entirely relegated to five formats for distribution: SVG, BMP, PNG, JPEG, and GIF. You may occasionally see PSDs or EPS files, or photography formats like DNG or standards-free RAW, but those fall into the same category as video codecs like ProRes, DNxHD, and CineForm: intermediate formats that require specialized software to view or edit and are converted when distributed beyond professional circles (sans EPS).
We're starting to see future image formats: Google with WebP, Apple with JPEG 2000 and HEIC, and Safari allowing inline MP4s to be treated as images. But for the past 10 years, much of the action in image compression has been trying to squeeze every last byte out of the existing formats, almost entirely JPEG and PNG (and SVG, but that's a different story). A lot of the slow movement of web formats has to do with the W3C. It took Cisco buying and distributing the MP4 patent license for free to make MP4 the accepted video format for Microsoft, Apple, Google, and Mozilla. It may take a similar act of corporate benevolence to bring a successor to JPEG.
Interestingly, there's been a concerted effort to squeeze every bit of optimization out of the existing formats: JPEG has MozJPEG, Guetzli, JPEGOptim, and jpegtran. PNG has Zopfli, PNGOUT, OptiPNG, AdvPNG, and PNGCrush. These all differ, as some are encoders and some are strictly optimizers, but the end game is to extract the most out of the formats, which often involves trickery to exploit the compression. Both ImageOptim and Squash are GUI front ends that use these optimizations to create the best JPEG or PNG per kilobyte possible. These libraries do not come without a penalty: CPU cycles. They can take minutes to execute on larger images, with Guetzli the longest; an 8 MP image can take around 40 minutes to encode, even on a 5th-generation Core i7. We're probably quickly approaching the point of diminishing returns. If you're using Guetzli, I'd argue it's easier to provide alternative image formats (WebP / JPEG 2000) than to burn hours encoding a handful of images, as you'll get better results for the people who can see them (Safari and Chrome users). The rest, however, are still viable.
PNG Compression tests
Settings used: ImageOptim (default)
- Strip PNG meta data
- Optimization Level: Insane
- More Compressed (slower)
Test 1: Complex Webpage screenshot
- Original: 2.3MB (2,298,405 bytes)
- ImageOptim: 1.7MB (1,685,034 bytes)
- Squash: 1.7MB (1,663,095 bytes)
Kaleidoscope Show differences results: No differences
Squash Savings over ImageOptim: 21,939 bytes (21.9K), 1.3%
Test 2: Simple Webpage screenshot
- Original: 33K (33,173 bytes)
- ImageOptim: 18k (18,538 bytes)
- Squash: 18k (18,538 bytes)
Kaleidoscope Show differences results: No differences
Neither of these is terribly surprising; Squash uses libpng and Zopfli, which are open-source PNG optimizations. I'm a little surprised that Squash shaved off a few more KB. To make sure this wasn't a fluke, I tested another screenshot, 2.9 MB (2,880,886 bytes); again Squash 2 won, 1.1 MB (1,116,796 bytes) to 1.1 MB (1,140,793 bytes), for a savings of 23,997 bytes (24K). On very large PNGs, Squash 2 has the advantage. I also checked PNGCrush, which brought it down to 1,126,420 bytes.
Test 3: Large Photograph
- Original: 10.4 MB (10,403,651 bytes)
- ImageOptim: 6 MB (5,927,713 bytes)
- Squash: 5.6 MB (5,597,048 bytes)
Kaleidoscope Show differences results: No Differences
This last test weighs most in Squash's favor: 330,665 bytes is significant, even if only a 6% difference.
While hardly the epitome of comprehensive testing, this shows Squash provides slightly better PNG compression. That said, ImageOptim is quite good for the sticker price of free. Squash 2 is part of the Setapp collection, or $15 standalone. Squash isn't as accomplished at JPEG optimization as ImageOptim, but it seems to be the best PNG GUI utility for OS X. It's surprising, too, as ImageOptim offers more options for optimization and uses the same optimization libraries. You can't really go wrong with either utility.
Mini Review of Squash
Squash is essentially a drag-and-drop no-brainer utility: drag images in and Squash does the rest. If you've used ImageOptim, then you're familiar with it. The differences between ImageOptim and Squash are mostly cosmetic, as both do the same operation. Squash appears to be no faster than ImageOptim, nor does it have as many options. The UI does provide a goofy animation and an annoying sound (I killed the sound effects immediately).
Where Squash won at PNGs, it lost out on lossless JPEG compression. Tests routinely showed that ImageOptim shaved off about 5% more from JPEGs on average, although individual tests differed wildly.
Squash 2 is a minimalist utility through and through: drag images in and it outputs compressed ones. Quite possibly the best thing Squash offers over ImageOptim is also one of the simplest: it can create new versions of each file appended with a suffix, whereas ImageOptim overwrites images, which can be undesirable.
There's been a bit of a cat-and-mouse game between adblockers/content blockers and advertisers/analytics/trackers. The short answer is you aren't going to defeat them single-handedly. Many of the libraries designed to detect them will fail, as they're inevitably blocked once a content blocker is updated to detect them. As someone who once ran an advertising-funded website that hit 150,000 unique visitors a month, I'm sympathetic to the publisher's plight. As a content writer, I value analytics; I use Google Analytics on this site, as it helps me understand what content resonates, what channels people use to find my content, and how they consume it. As a developer with a touch of UX, logging and error tracking are extremely helpful. A service like Loggly can help me find errors, design to catch edge cases that aren't on the "happy path," and make data-driven decisions about a product. However, the advertising industry has perniciously proven it is not to be trusted. There's a reason why, as a user, I surf with Ghostery/1Blocker, block cross-origin cookies (on my desktop, I kill all cookies), use a VPN, and disabled Flash long before most people did, to dodge the dreaded forever Flash cookie. Privacy matters.
This is my attempt to create an ethical framework around content blocking from the perspective of a developer/content creator/publisher.
A quick list of observations
I've assembled a list of facts/observations about content blockers.
- Adblock/Adblock Plus focus on advertising but not analytics. This could change in the future.
- 1Blocker and Ghostery are particularly good content blockers. Both will block <script> tags from loading, as well as any onerror code at the src level.
- Content blockers are not fooled by appending
- Blocked <script> tags are not removed from the DOM by 1Blocker or Ghostery, so any checks to see if the tags exist will return true.
- 1Blocker and Ghostery can detect popular anti-blocker scripts and block them as well.
- Browsers are pushing privacy settings more aggressively, with Firefox leading the charge and Safari not far behind.
- If your website fails to work with one of the popular content blockers enabled, you are cutting out 20% of your audience.
But I'm a special snowflake!
Using powers for good
So as a developer/UX designer you're suddenly faced with a problem. Your website or web app has features that break when content blockers are enabled. You've already made sure that your core functionality isn't tied to anything that will be blocked by content blockers.
Likely your client or manager will ask "can't you just go around the content blocker?".
The short answer is "no." You will not forcibly defeat content blockers, and if you try, you're signing up for an unwinnable, all-consuming, cat-and-mouse game. However, you can potentially detect content blockers rather than defeat them. With a service like Loggly, you can easily check whether the _LTracker variable has loaded.
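As a sketch of that check (the _LTracker global is the one Loggly's tracker script defines; the function and parameter names here are my own):

```javascript
// Sketch: detect that a third-party script was blocked by checking whether the
// global it defines exists. The global object is passed in so the check can be
// exercised outside a browser; in a page you'd pass `window`.
function isScriptBlocked(globalObj, globalName) {
  return typeof globalObj[globalName] === 'undefined';
}

// Browser usage, run some time after the tracker's <script> tag should have loaded:
// if (isScriptBlocked(window, '_LTracker')) {
//   // A content blocker (or a network failure) stopped the tracker;
//   // degrade politely rather than breaking core functionality.
// }
```

Note this can't distinguish a content blocker from a flaky network or CDN outage, which is another reason to degrade gracefully rather than punish the user.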
Suddenly we're at the ethical precipice as we can do a number of things with this information. I've assembled a list of the ethical paths.
Ethics of content blocking code
Most ethical: The website/web app's core features work without any warnings until the user reaches an ancillary feature that may be broken. The user is able to complete core functions (consume content, use navigation, submit forms).
Example: Videos still work. User is able to place orders but 3rd party chat tech support may be broken. User is informed.
Less ethical: The user receives warnings on every page encouraging them to whitelist the site, regardless of whether functionality is affected.
Example: The user is pestered with a whitelist-the-site message. The user is still able to perform operations. Videos still work. The user is able to place orders. Third-party live chat tech support may be broken. The user is informed.
Least ethical: The user is blocked from consuming content until the site is whitelisted, regardless of whether functionality is affected.
No Ethical Stance: Site does not attempt to detect any blocked content. Site either functions or does not. This is the majority of websites.
This model isn't free of problems; it's almost entirely from the lens of a non-advertisement-supported website, like a campaign site, company site, e-commerce site, or SaaS. While these sites may contain advertising and tracking, all of the aforementioned generate revenue through sales (SaaS/e-commerce) or lead generation (campaign/company). Websites that are dependent on ad revenue adhere to a different set of ethics and variables.
Other methods for checking whether a script loaded
Checking for a variable's existence is the most fail-safe method to see if a script has loaded. While
onerror will not work on an individual script tag, you can write scripts into the head with the following code. This comes at a mild expense of code execution and may not work in all scenarios.
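Here is a sketch of that approach: injecting the <script> tag from JS, where onload/onerror callbacks are reliable. The document object is a parameter so the wiring can be exercised outside a browser (in a page you'd pass document), and the URL is hypothetical.

```javascript
// Sketch: inject a <script> tag and attach onload/onerror handlers, which fire
// reliably on dynamically created scripts (unlike onerror on a static tag).
function loadScript(doc, src, onLoad, onError) {
  const script = doc.createElement('script');
  script.src = src;
  script.onload = onLoad;   // Script was fetched and executed.
  script.onerror = onError; // Network failure, or a content blocker.
  doc.head.appendChild(script);
  return script;
}

// Browser usage (hypothetical URL):
// loadScript(document, 'https://example.com/tracker.js',
//   () => console.log('tracker loaded'),
//   () => console.log('tracker blocked or unreachable'));
```

The caveat from the list above still applies: the more popular such a detection script becomes, the more likely content blockers will learn to neutralize it.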