First post of the New Year. Still got it in January. I moved states, bought a house, and got a dog, so blogging has not been at the top of the list. Now that I’ve emptied my storage unit, I found my old buddy Tinman. He’s a Core i7 ‘server’ with two GTX 480s and 2TB of space. He’s old, but sturdy. He deserves a place of glory. And an Arch installation.
If you followed my last article on this subject, you have a GPG setup without the master private key on your computer. You also put an expiration date on that key. When your key expires, it can no longer be used to encrypt data for you. It can, however, still decrypt messages from before the expiry date. This does not help in the situation where your private key is compromised.
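As a quick reference, here is a sketch of checking and extending an expiry, assuming GnuPG 2.1+; the fingerprint below is a placeholder, not a real key:

```shell
# List keys with their expiry dates (look for the "expires:" field).
gpg --list-keys --keyid-format long

# Extend the primary key's expiry by one year.
# The fingerprint is a made-up placeholder -- substitute your own.
# This needs the master private key, so with an airgapped setup it has
# to happen on the offline machine, then re-export the public key.
gpg --quick-set-expire 0123456789ABCDEF0123456789ABCDEF01234567 1y
```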
I notice myself asking questions about common tasks when I come back to Arch after a long time. I think that makes for a good article, as it is possible others have these questions. I’m running out of space and that’s stupid. Not a question, but I’ve felt that exact sentiment. In Arch, this is very likely the Pacman cache going crazy. Every time you upgrade a package in Pacman, it keeps the previous version.
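A sketch of the usual cleanup, assuming the pacman-contrib package is installed (paccache ships with it):

```shell
# See how much space the cache is eating.
du -sh /var/cache/pacman/pkg/

# Keep the two most recent versions of each package, delete the rest.
sudo paccache -rk2

# Drop cached versions of packages that are no longer installed at all.
sudo paccache -ruk0
```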
I encountered some issues running WebStorm, and I think I finally solved them. This was an annoying bug caused by a mix of WebStorm and OSX behavior that I didn’t ask for. TODO: Pinentry Image. This is the prompt that you get on OSX running Pinentry. I was getting this randomly all throughout the day just running JetBrains products. That’s really annoying. You can see that WebStorm is trying to run git-upload-pack when I get the Pinentry prompt, and that is what triggered it.
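One common way to make these prompts less frequent (not necessarily the fix the rest of this post lands on) is to raise gpg-agent’s passphrase cache lifetime; the TTL values below are arbitrary examples:

```shell
# Cache passphrases for 4 hours after last use, 8 hours maximum.
cat >> ~/.gnupg/gpg-agent.conf <<'EOF'
default-cache-ttl 14400
max-cache-ttl 28800
EOF

# Restart the agent so the new settings take effect.
gpgconf --kill gpg-agent
```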
I’m using the gpg-agent in place of the ssh-agent. I think this is a very interesting use because it eliminates the need for me to store my ssh key as a flat file:

```
> ssh-add -l
4096 SHA256:rsOIZD3XP+Tvj+l5xrbRnxgvdg2qKL5agAxzPLT5rao (none) (RSA)
2048 SHA256:U6ETCKbdPbvgPMSjePS0jrGR3yMdhF9NC6MUHItynJc /Users/admin/.ssh/splice-dcos.pem (RSA)
...
```

You can see here that the top key is one that is generated by GPG and not associated with any particular file. That being said, I still have to use SSH keys that are given to me for work.
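For reference, the usual wiring for this setup looks something like the following (a sketch; the shell profile path is an assumption, and this presumes GnuPG 2.1+):

```shell
# Tell gpg-agent to also act as an SSH agent.
echo "enable-ssh-support" >> ~/.gnupg/gpg-agent.conf

# Point SSH at gpg-agent's socket instead of ssh-agent's.
# Put this line in your shell profile, e.g. ~/.bash_profile.
export SSH_AUTH_SOCK="$(gpgconf --list-dirs agent-ssh-socket)"

# Restart the agent; afterwards ssh-add -l should list the GPG-backed key.
gpgconf --kill gpg-agent
```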
Airgapping GPG

Airgapped Media Format

I’ll save you the explanation of what GPG is. First thing I needed was a USB stick that would be compatible with Arch and MacOS. This fight is always interesting, and I would rank the better operating system as the one that can compromise on the filesystem. Naturally, insert the USB stick into the Mac and format it as a Journaled HFS+ partition, because we can install the driver on Arch.
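Assuming the stick shows up as /dev/sdb2 on the Arch side (the device name will vary; check lsblk), the kernel’s hfsplus module can mount it. Write support for journaled HFS+ is limited, so I’d lean read-only:

```shell
# Identify the partition -- /dev/sdb2 below is an assumption.
lsblk -f

# Mount the HFS+ partition read-only.
sudo mount -t hfsplus -o ro /dev/sdb2 /mnt

# For read-write, journaling has to be off (Disk Utility on the Mac can
# disable it), or the force option is required:
# sudo mount -t hfsplus -o rw,force /dev/sdb2 /mnt
```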
I have trouble sometimes navigating network connectivity using Arch. The OS doesn’t really have an issue getting out to networks; I just seem to not remember what to do in a time of need. Unfortunately, it is usually the first thing you need to do to begin the problem-solving process. It is not lost on me that I am posting this article on the Internet. You’ll find it after you get into trouble.
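A minimal triage sequence, assuming a wired interface and dhcpcd installed (the interface name enp0s3 is a placeholder — check ip link for yours):

```shell
# What interfaces exist, and are they up?
ip link

# Bring the interface up and ask for a DHCP lease.
sudo ip link set enp0s3 up
sudo dhcpcd enp0s3

# Can we reach the outside world, and does DNS resolve?
ping -c 3 8.8.8.8
ping -c 3 archlinux.org
```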
This seems to be a bit of a nightmare with Arch. I think the issue is that it is difficult to test. I typically connect to pretty good networks that have passwords and reasonable security. On vacation though, I don’t have a choice of what to connect to. I really don’t know what the advantage of those open networks with the stupid captive-portal login page is supposed to be over WPA Enterprise. There is no way that is more secure.
You can set your local timezone to look at the files in a sensible way. Everything is stored in UTC.

```
> timedatectl list-timezones
> sudo timedatectl set-timezone America/Los_Angeles
> timedatectl status
      Local time: Wed 2018-04-18 11:34:46 PDT
  Universal time: Wed 2018-04-18 18:34:46 UTC
        RTC time: Wed 2018-04-18 18:34:45
       Time zone: America/Los_Angeles (PDT, -0700)
 Network time on: yes
NTP synchronized: yes
 RTC in local TZ: no
```

You can see all the ‘units’ that you have using the following command.
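Units are systemd’s objects (services, mounts, timers, and so on); one way to list them is:

```shell
# List units that systemd currently has loaded.
systemctl list-units

# Include units that are installed but not currently loaded.
systemctl list-units --all
```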
Before going on, make sure you meet the requirements: General recommendations after an Arch install can be found here. I’m going to use this article to collect my list of requirements. Okay, you have to install sudo. You also need a user, and a sudo group. Use visudo to edit the sudoers file. groupadd sudo will add the sudo group; then you can gpasswd -a drone sudo to add drone to the sudo group.
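Put together, the steps above look something like this sketch (run as root on a fresh install; drone is the example user, and the %sudo line is the piece visudo needs to add, assuming you want the whole group to have full sudo rights):

```shell
# Install sudo, create the group and an example user, and link them.
pacman -S sudo
groupadd sudo
useradd -m drone
gpasswd -a drone sudo

# In visudo, uncomment or add this line to grant the group sudo access:
#   %sudo ALL=(ALL) ALL
EDITOR=vi visudo
```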
I wanted to install Arch because all the cool kids do it. Really, what I wanted was more control over the system and the decisions I had to make to get it to work. In the past, I have favored speed and stability of installation because I didn’t want to spend time configuring stuff that didn’t relate to what I wanted to accomplish. That basically means LTS Ubuntu.
Motivation

I wanted to get my environment ready for Android development. It was pretty straightforward on the Mac; I can see it being more of a headache on other platforms.

Procedure

Firstly, download Android Studio. It is pretty amazing that they’ve wrapped everything you need in a free IDE. There’s a ton of shit for Java projects that I’d rather not think about. I’m using this training course. Open it in another tab.
Zapier

Zapier gives me a list of things that I will need to configure for the Zapier app to be able to communicate with my app via OAuth. I think this is a good way to call out the routes I need to build.

Authentication Type: OAuth V2 w/refresh
Client ID (AKA “Consumer Key” or “API Key”): zapier
Client Secret (AKA “Consumer Secret” or “API Secret”): asdjhalsgdhgaposdigu (some string)
Authorization URL:

For now I will list what I see the consent app presenting.
These are the steps (both up and down) to get the Hydra Consent Flow example working locally on a Mac.

Infrastructure

Create Database

Up:
```
echo "create database hydra" | mysql -uroot -proot
```
Down:
```
echo "drop database hydra" | mysql -uroot -proot
```

Export Stuff

Up:
```
export DATABASE_URL=mysql://root:root@tcp(docker.for.mac.localhost:3306)/hydra?parseTime=true
export SYSTEM_SECRET=yJFLU44byGmmKLwJHvramNknAmSQR27C
```
Down: Nothing.

Migrations

Up:
```
docker run -it --rm \
  oryd/hydra:v0.10.10 \
  migrate sql $DATABASE_URL
```
Down: Nothing.

Run Hydra Container

Up:
```
docker run -d \
  --name ory-hydra-example--hydra \
  --network hydraguide \
  -p 9000:4444 \
  -e SYSTEM_SECRET=$SYSTEM_SECRET \
  -e DATABASE_URL=$DATABASE_URL \
  -e ISSUER=https://localhost:9000/ \
  -e CONSENT_URL=http://localhost:9020/consent \
  -e FORCE_ROOT_CLIENT_CREDENTIALS=admin:demo-password \
  oryd/hydra:v0.
```
Okay, this is my first foray into AppleScript, and I have to say that I don’t hate it. I wanted a way to take screenshots directly into Preview on my Mac. Turns out that the screenshot utility is available on the command line:

```
> screencapture -h
screencapture: illegal option -- h
usage: screencapture [-icMPmwsWxSCUtoa] [files]
  -c  force screen capture to go to the clipboard
  -b  capture Touch Bar - non-interactive modes only
  ...
```
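As a sketch of the simplest route (the file path is an assumption; -i lets you drag-select a region interactively):

```shell
# Capture an interactive selection to a temp file...
screencapture -i /tmp/shot.png

# ...and hand it straight to Preview.
open -a Preview /tmp/shot.png
```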
Let’s say you want to know if your boss is away on vacation next week. So you call their admin and say “Can you double-check that my phone number is correct if the Boss is out next week?”. They load up the boss’s calendar to check and, based on their presence next week, then load up your info. Only once that is done do they take the time to remember the boss didn’t want you to know whether they are in or out.
Background Zapier is effectively a task runner. Integrating with them would put FlowMojo in a marketplace of other productivity applications. I think it will really fuel adoption. To get in the door with Zapier, I need to become an OAuth provider. You can create a Zapier Application for your WebApp and give that to Zapier. Then, your users that also have Zapier accounts can go to Zapier and choose to make a ‘Zap’ with the Zapier Application you created.
I wanted to switch to i3, a tiling window manager. There are some things that are better, and some things that I miss. I run Ubuntu 16.04 LTS and Unity provides a lot of functionality that makes using the computer effortless. The motivation for using i3 is that, while developing, I end up tiling all my windows in separate workspaces anyways. It seems like the natural functionality of i3 better fits my use-case.
I recently discovered that changing the size of an EC2 instance does not destroy the EBS volume associated with it. Let that sink in. You have a computer sitting on your desk serving your website. You realize that you’re spending too much money on electricity for this computer, so while it is running, you rip the hard drive out and put it in another computer. Five seconds later, your website is still running.
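The resize itself is just a stop/modify/start cycle; here is a sketch with the AWS CLI (the instance ID and target type are placeholders):

```shell
# Stop the instance -- the EBS root volume persists through this.
aws ec2 stop-instances --instance-ids i-0123456789abcdef0
aws ec2 wait instance-stopped --instance-ids i-0123456789abcdef0

# Change the instance type, then start back up on the same volume.
aws ec2 modify-instance-attribute \
  --instance-id i-0123456789abcdef0 \
  --instance-type '{"Value": "t2.large"}'
aws ec2 start-instances --instance-ids i-0123456789abcdef0
```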
Background

I needed to get a Wordpress site working for my side project. The installation process is all fine and dandy, until you want to do the right thing and enforce TLS for your site. I ran into issues with mixed-content errors. Wordpress was requesting jQuery and some CSS files over the insecure routes to my domain.

Attempted Solutions

I had a redirect block in my nginx configuration, but that did not seem to solve the issue.
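One fix I’d reach for here (not necessarily the one this post ends up with): mixed content often comes from http:// URLs stored in the Wordpress database, which can be rewritten in place. This assumes WP-CLI is available and uses example.com as a stand-in domain:

```shell
# Dry run first: report what would change without touching the database.
wp search-replace 'http://example.com' 'https://example.com' --dry-run

# Rewrite the URLs for real (covers siteurl/home and embedded asset links).
wp search-replace 'http://example.com' 'https://example.com'
```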
Detail an SSH Connection

Symmetrical Encryption

One key can be used to encrypt messages Alice->Bob, but can also be used to decrypt messages Bob->Alice. Anyone that holds the key can encrypt and decrypt messages. AKA “Shared Secret” or “Secret Key”. SSH uses symmetrical encryption for the connection, contrary to what most people believe (asymmetric). Asymmetric encryption is only used for authentication.

Key Exchange Algorithm

Using a key exchange algorithm, the client and server can exchange data over the not-yet-encrypted connection to arrive at a shared secret that can be used for the symmetrically encrypted connection.
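To make the key-exchange idea concrete, here is a toy Diffie-Hellman exchange in plain shell arithmetic. The numbers are tiny for illustration only; real key exchanges use enormous primes or elliptic curves:

```shell
#!/bin/sh
# Toy Diffie-Hellman: both sides derive the same shared secret while
# only ever exchanging p, g, A, and B publicly.

# modexp base exp mod -- square-and-multiply modular exponentiation.
modexp() {
  base=$1; exp=$2; mod=$3; res=1
  while [ "$exp" -gt 0 ]; do
    if [ $((exp % 2)) -eq 1 ]; then res=$(( (res * base) % mod )); fi
    base=$(( (base * base) % mod ))
    exp=$(( exp / 2 ))
  done
  echo "$res"
}

p=23; g=5          # public parameters, sent in the clear
a=6; b=15          # private values, never transmitted

A=$(modexp "$g" "$a" "$p")   # client sends A = g^a mod p
B=$(modexp "$g" "$b" "$p")   # server sends B = g^b mod p

client_secret=$(modexp "$B" "$a" "$p")   # client computes B^a mod p
server_secret=$(modexp "$A" "$b" "$p")   # server computes A^b mod p

echo "client: $client_secret  server: $server_secret"  # prints "client: 2  server: 2"
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is what makes this safe at real key sizes.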
Hey hey! I got it to work! I’ll use this post to log what I had to do to get it to work. Motivation I wanted a site for my personal use. I started off with a full stack deployed on AWS, but when your trial runs out, you can get hit with some pretty hefty usage charges. That’s quite a bit of money for something that I really wanted as a note store.