Home automation with Apple HomeKit: awesome, or just another platform?

Hello everyone, it’s been a while (understatement of the century), and this time I’m back with a post about home automation and HomeKit. Ever since HomeKit launched I have been curious about it, and when I bought my home I decided to go down this path. Being a techie, all of this is of great interest to me, so I hope it interests you too. This is just a post about what I implemented and what my experience was like; more detailed posts about the products will come soon, I promise.

What I ended up implementing

These are all great products and each would deserve a post of its own, but for now I am just going to talk about my overall experience with HomeKit and these devices.

HomeKit hubs

Since we have several Apple devices at my place, we use a number of different hubs: an Apple TV 4K, a HomePod (1st gen) and a HomePod mini. They take turns as active/standby devices and so far they have worked quite well; since iOS 16.3 the HomePod mini even works as a temperature/humidity sensor, which is great. My advice to anyone out there would be to buy smart devices that don’t lock you into a single ecosystem (be it Apple, Google or Amazon). With the introduction of protocols like Matter this shouldn’t be a problem anymore; or rather, it shouldn’t matter which ecosystem you use (pun intended).

Ikea Dirigera hub

The hub is not strictly required to use Ikea smart home products; however, it becomes a necessity once you want to do anything beyond the very basics. The integration with HomeKit is very good in terms of functionality, and the process of pairing the different devices has come a long way since the early days of Ikea smart home; it now works flawlessly. Automations are easy to set up and seem to be very stable.

Ikea Fyrtur smart blinds

This product has really impressed me, and I have to say Ikea’s idea here was quite good: they produced an inexpensive product that works consistently and does what it’s intended to do. The battery life on these things has been impressive; at the time of writing they have been working for almost 3 months and most of them are still at around 80% battery. That is really good in my opinion, and the fact that Ikea also sells replacement batteries makes me think these were a good investment. One small criticism of this product: they are a bit noisy. They won’t wake up your neighbours, but I feel they could be a bit quieter.

Airversa smart air purifier

This is one of my favourite products. Quiet and easy to use, the Airversa smart air purifier has been a workhorse around our house. With a silent but efficient fan, it has made our home a better place to live and breathe. The filters are quite cheap and last a long time, and this helpful friend will only make you aware of its presence when it has hard work to do; even then, it quickly goes back to being as silent as usual. With a cat around the house this really is a must for us.

Tado wired smart thermostat

This is the toy I am most enthusiastic about at the moment. Having installed it only yesterday, the Tado thermostat is already working hard to save me money on my heating bills (and we all know how much that is needed right now). Geofencing is amazing (it detects when nobody is home and shuts down your boiler), and the simple, intuitive user interface works like a charm; I think even my 6-year-old son could operate it. Installation was as easy as connecting two wires. One thing I really like is that you don’t need an account with them: if you are an Apple user you can run your smart heating from the Home app, and I’m sure it works in a similar way with other smart home platforms. I plan on writing another post soon about the other products in the Tado catalogue I intend to acquire.

Conclusion

So far my adventures with HomeKit have been quite positive. The workflow for adding new devices has been easy, with not much faffing around to get things to work. Automations seem good too, even if they are a bit simple. I for one would like to see conditionals implemented; for example, it would be good to be able to write an automation like “At this time of day, if the temperature outside is lower than X, then do Y”. The current form also works, I just wish we had more control.

Anyway, I hope you enjoyed this post, and please come back soon for more posts about this smart home stuff. I promise I won’t be gone for long.

HashiCorp Vault, keeping things secret – Part 1: install and configure Vault

Hello folks

Today I am going to talk about a great tool from HashiCorp called Vault (https://www.vaultproject.io). This piece of software is, in my opinion, an essential part of our DevOps toolkit. It allows you to safely store, and dynamically generate, secrets for your infrastructure. The main use case for me has been setting passwords in Terraform templates without exposing the password itself in clear text, but what I am covering here is how to install it and get it up and running on your Linux server. I will also provide a Dockerfile you can use to spin up a container and play around with Vault.

So let’s get our hands dirty

1 – Download the binary from https://www.vaultproject.io/downloads.html. You will see links for macOS, Linux, BSD, etc.; choose the one for your platform. For example:

#wget https://releases.hashicorp.com/vault/0.8.1/vault_0.8.1_linux_amd64.zip

2 – Uncompress the archive and copy the binary into a directory in your $PATH:

#unzip vault_0.8.1_linux_amd64.zip ; cp vault /usr/local/bin

3 – Test that you can execute vault:

#vault -v

This should return something like:

#vault -v
Vault v0.8.1 ('8d76a41854608c547a233f2e6292ae5355154695')

Of course your values could be different as newer versions are released. You should also do the same thing on your workstation, as the same binary is used for the client too: download it, expand the zip file and copy it to a location of your choice (as long as that location is in $PATH).

So, now that we have Vault in place, we can start the server. This is done with the command:

#vault server -config <PATH_TO_CONFIG_FILE>
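The config file itself is a small HCL document. As a minimal sketch (the storage path is just an example, and TLS is disabled here only to make it easy to play around; enable it for anything real), it could look like this:

```hcl
# Example Vault configuration: file storage backend,
# listening on all interfaces, TLS disabled for testing only.
storage "file" {
  path = "/var/lib/vault"
}

listener "tcp" {
  address         = "0.0.0.0:8200"
  cluster_address = "0.0.0.0:8201"
  tls_disable     = 1
}
```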

The output will look something like:


==> Vault server configuration:

Cgo: disabled
Listener 1: tcp (addr: "0.0.0.0:8200", cluster address: "0.0.0.0:8201", tls: "disabled")
Log Level: info
Mlock: supported: true, enabled: true
Storage: file
Version: Vault v0.8.1
Version Sha: 8d76a41854608c547a233f2e6292ae5355154695

==> Vault server started! Log data will stream in below:

Please make sure your firewall allows port 8200/8201 TCP to this server.

If you want to use Vault inside a Docker container, you can check out this git repo, as it contains a Dockerfile and an example configuration file for Vault:

https://github.com/ruimoreira/blogexamples

OK, so now we have Vault running. Let’s initialize it. I would advise you to execute this on your workstation:

export VAULT_ADDR='http://<VAULT_SERVER_IP_ADDRESS>:8200'

Let’s check that we can actually reach it:

# vault status
Error checking seal status: Error making API request.

URL: GET http://127.0.0.1:8200/v1/sys/seal-status
Code: 400. Errors:

* server is not yet initialized

So this tells us that we can indeed reach the server; however, it’s not initialized.

So let’s do just that:
#vault init
Unseal Key 1: IIeMHIGq+xmIDqXN7Q43Lt7nmi5sLvNad5NgUjOVPoiA
Unseal Key 2: phNTpSyjBqobHYeLVOfiaUHQ6iidw2/BowKnTb3HzaC4
Unseal Key 3: jJcuYrSiQRHv0TvD1/AVrHBpd2f6mjtjriGLa66A2O5b
Unseal Key 4: so5WqFp1nmXFeuLE4tUZiglCTEBP2gkc9/teNZNvVOmz
Unseal Key 5: hCNg3wwVfYY/x0A6TLVmvyKyutilr5qvhkiH4mUDHWXR
Initial Root Token: 880fde6a-f672-fe8e-50d0-2e51f566654a

As we can see, Vault has provided the unseal keys, and the root token to authenticate with.

At this point you need to unseal the vault, providing 3 of the keys via vault unseal.
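As a sketch of that last step (the keys here are placeholders; use any 3 of the 5 keys vault init printed for you):

```
#vault unseal <UNSEAL_KEY_1>
#vault unseal <UNSEAL_KEY_2>
#vault unseal <UNSEAL_KEY_3>
#vault status
```

After the third key is accepted, vault status should report Sealed: false, and you can authenticate with the root token.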

Hope you find this useful and hope to see you again soon.

 

Rui Moreira

PS: If you are using a Docker container to play around with Vault, remember to use the -p option to expose the port of the container you are running Vault on.

More information here

WordPress Security and SELinux

Hello all,

Here we are again. Today I am going to address WordPress, the security of said platform, and how it relates to SELinux.

Many have claimed that WordPress is in itself a flawed platform and inherently insecure; I could not disagree more. Of course, as with every web application, there are a lot of security aspects to take into account, and this should not be taken lightly. In my opinion there are a few things we can do to increase the security of our WordPress install and keep our website’s reputation intact, or at least not at permanent risk.

As most of you know, I have in the past worked for one of the most famous hosting companies in the world, the birthplace of OpenStack and overall an awesome place to work, and during my career there I saw a lot of nonsense around WordPress installs. The same could be said about several other CMS platforms, but WordPress is one of the most famous. These are the most common errors I have seen, although there are others:

1 – apache user owning the document root

2 – chmod 777 on the document root

3 – features developed that prevent wordpress or php from being updated

4 – outdated or exploitable plugins

5 – SELinux disabled

Let’s look at each one of them individually.

1 – apache user owning the document root

This is very dangerous: it allows an attacker to exploit possible flaws in WordPress to write to the document root of your website. The attacker will then use the file he or she uploads (usually a malicious script) to eventually download some sort of PHP shell (a piece of software that mimics the functions a shell would implement) and try to take control of your web server and website.

While installing WordPress, pay attention to the file permissions and to the recommendations in the WordPress Codex (https://codex.wordpress.org) regarding them. Basically, Apache needs to be able to write to wp-content/uploads and not much more. If you plan on updating WordPress via the FTP feature, please create a dedicated user for that process; that user can own the entire document root, with Apache only able to read the files. Also, if you are doing this, please restrict the FTP process to localhost, as FTP is somewhat insecure.
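To make that concrete, here is a rough sketch of the usual permission scheme (directories 755, files 644), applied to a throwaway directory so it can be run safely. The paths are examples only, and on a real install you would also set ownership (for example chown -R ftpuser:apache) to match your own setup:

```shell
# Throwaway "document root" to demonstrate the permission scheme
docroot=$(mktemp -d)
mkdir -p "$docroot/wp-content/uploads"
touch "$docroot/wp-config.php" "$docroot/index.php"

# Directories 755, files 644: the web server can read everything,
# but only wp-content/uploads should ever be made writable for it
find "$docroot" -type d -exec chmod 755 {} +
find "$docroot" -type f -exec chmod 644 {} +

stat -c '%a' "$docroot/wp-content/uploads"   # prints 755
stat -c '%a' "$docroot/wp-config.php"        # prints 644
```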

2 – chmod 777 on the document root

This one is a big no. If your website has chmod 777 on the document root, I would suggest you place it under maintenance and review all the file permissions, resetting them to what they should be. Again, the WordPress Codex is a good place to start.

3 – features developed that prevent wordpress or php from being updated

This one is my pet peeve; it should never happen. Your developer should test whether the code will still survive at least a WordPress upgrade. If some function gets deprecated in a PHP version upgrade and that breaks your code, you should, in my opinion, pay your developer to make the proper fixes. This might sound like I am partial to developers, but I do insist that they should not work for free (nobody should, really), and if your website’s reputation is important to you, then you need to agree with them on how to proceed. The WordPress version upgrade should be the starting point, even before your website goes to production.

4 – outdated or exploitable plugins

This one is really a must. As I have said, keeping WordPress and its plugins updated is vital for the security of your website.

5 – SELinux disabled

In my opinion we should never have SELinux disabled. There are always ways to make it work, and with the tools we have available (audit2allow, audit2why, etc.) there is no reason to have it disabled. Instead, learn how to use it and embrace it, at least if you are serious about security and about the website you are hosting. If you are not, then you have no business hosting a website in the first place.
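As a rough sketch of that workflow (run as root on a Red Hat style system with the policycoreutils tools installed; the module name wp_local is just an example I made up):

```
# See why recent AVC denials involving httpd happened
grep httpd /var/log/audit/audit.log | audit2why

# Generate and load a local policy module covering those denials
grep httpd /var/log/audit/audit.log | audit2allow -M wp_local
semodule -i wp_local.pp

# Often a prebuilt boolean is all you need, e.g. to let WordPress
# make outbound connections for updates:
setsebool -P httpd_can_network_connect on
```

Do review what audit2allow generates before loading it; blindly allowing every denial defeats the point of SELinux.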

In my next post I will talk about ways to improve the performance of your WordPress instance and perform some basic hardening.

Until then … stay safe !

 

Cheers

Rui

 

Amarone della Valpolicella Classico Tedeschi 2012

Hello all

So, as per my previous post, I ordered some Amarone della Valpolicella Classico Tedeschi 2012 from Vivino, the website/app that allows you to review wines and share your reviews with other people. So here I am to review it, and this will focus not only on the wine itself but also on Vivino’s service. First of all, I am quite happy with them: the order process was quick and I got the wine delivered in 24h (I live in London, so this could be why). As for the wine itself, it’s everything the ad described: a nice aroma, a very interesting taste, and a wine worthy of the “Legendary” title. In my opinion, one of the best red wines I have had so far.

I had this wine with a veggie lasagna, but it would go well with any meat dish; I think it would also pair very well with my lamb stew, or any roast for that matter.

So I would say this wine is well worth the price, and Vivino’s service was a good choice: I had the wine delivered to my door without any hassle, on time (actually a few days ahead of what I was expecting), and the brilliant packaging made sure the wine got to me intact.

Good work Vivino !!!

You will hear from me again, I promise !

Edit: The price I mentioned is with a Vivino discount, actual price may be higher without the discount.

 

Wine Reviews – The beginning

Hello all

So, while I was on vacation I installed a new app on my phone. It’s called Vivino, and its main purpose is to let you review the wines you drink, simple enough, and also to share your reviews with other people.

So now they are making some suggestions of their own, and have sent me an email with the following wine:

https://uk.shops.vivino.com/legendary-amarone-della-valpolicella-classico-tedeschi-2012/

So this will be the first time I use their service to order wine, and as soon as the wine arrives I will do an unboxing post to show you how the service is presented, and also the wine per se.

Stay tuned for updates

 

Rui

CentOS / Red Hat: Remove old kernel versions

Hello again

So today I started installing a few updates on one of my servers (CentOS 7) and ran into the following issue: I could not install the new kernel version because there was not enough disk space in /boot to accommodate it. So I went about removing the old files and solving this problem permanently. First, I installed yum-utils:

#yum install yum-utils

Then I used the package-cleanup utility (a handy Python script, yay Python!!!) that allows us to remove duplicate or orphaned packages.

Here is an example:

#package-cleanup --oldkernels --count=2

What this does is remove old kernels, keeping only the last 2.

So this does what we want, which is to remove the older versions of the kernel; however, we might hit this problem again in the future. Looking for a solution, I checked /etc/yum.conf, and there we have the option:

installonly_limit=5

According to Red Hat and their deployment guide:

installonly_limit=value

…where value is an integer representing the maximum number of versions that can be installed simultaneously for any single package listed in the installonlypkgs directive.
The defaults for the installonlypkgs directive include several different kernel packages, so be aware that changing the value of installonly_limit will also affect the maximum number of installed versions of any single kernel package. The default value listed in /etc/yum.conf is installonly_limit=3, and it is not recommended to decrease this value, particularly below 2.

 

So, I changed yum.conf to read:

installonly_limit=2

There, we have avoided this problem.
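If you want to script that change, a one-line sed does it. A small sketch below, done on a throwaway copy of the file so it can be run without root; on a real box you would point it at /etc/yum.conf instead:

```shell
# Work on a throwaway copy (use /etc/yum.conf as root on a real server)
conf=$(mktemp)
printf '[main]\ninstallonly_limit=5\n' > "$conf"

# Keep only the last 2 installed kernel packages from now on
sed -i 's/^installonly_limit=.*/installonly_limit=2/' "$conf"

grep '^installonly_limit' "$conf"   # prints installonly_limit=2
```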

Of course this has an impact on how many old versions of the kernel you keep, and I do recommend that you set this value taking your needs into account, but for me, at this point, 2 is enough.

Linux Academy … learning further and personal development

Hello all

As we all know, part of being a good sysadmin is learning how to reinvent yourself every once in a while, and this can happen quite often, as the tech market moves at lightning speed. So this week I started using Linux Academy to improve my Linux mojo and learn a few new tricks. This is proving to be quite interesting: the website not only provides videos with the training materials, but also gives you labs, actual servers where you can test things and learn further. I will document my progress as I go, but so far it looks grand!!!

Beware that this is not free; it costs around $25/month, but in my opinion it’s well worth it if it gets you well prepared for your certifications and teaches you new things.

Ouya … the little console that was

According to the popular website Phoronix, Ouya’s software assets have been acquired by Razer, news that is confirmed on Razer’s own website: http://www.razerzone.com/press/detail/press-releases/razer-acquires-ouya-software-assets . This is good news for the people working for Ouya, for their game developers, and for Ouya as a company; whether it is good news for the users, I have yet to find out as a proud Ouya owner.

To quote Razer’s website

“In the near future, Razer will be providing existing OUYA users with a clear path of migration to the more advanced Forge TV micro-console and Serval controller bundle. Razer’s intention is to allow OUYA users to bring their games, controllers, and accounts to the Cortex TV platform on the Forge micro-console, advancing the experience of Android gaming on TV that they have previously enjoyed. Additionally, Razer is planning deep product discounts for incoming OUYA users to purchase Razer hardware, and a spate of freebies, giveaways, and promotions to enjoy on their new Forge consoles.”

I am not sure at this point what this means and I anxiously wait to hear from them 🙂

 

Neil Young and the streaming services

Hello Internet,

So this whole thing about artists and streaming continues, and it keeps getting sillier by the day. It was brought to my attention that the artist Neil Young has pulled his music from the streaming services. I do support his decision, it is his work after all, but some things are ridiculous. He has said, and I quote: “Streaming has ended for me. I hope this is ok for my fans. It’s not because of the money, although my share (like all the other artists) was dramatically reduced by bad deals made without my consent. It’s about sound quality.” Source: https://www.facebook.com/NeilYoung

This is funny because, if it’s not about the money, why mention it?

Also, taking into account that a vast number of people these days listen to music on mobile devices, or on any sort of digital device that plays compressed MP3 files, is this still an issue? Or is it related to the fact that Mr Young has already developed his own “high quality” download service and music store, https://www.ponomusic.com, where they sell their own “PonoPlayer” (a horrible-looking thing, if you ask me)? In any case, I do think his attitude is “funny” and odd because of the way this is said, but he does have the right to do this, and I respect him for not just complaining but actually doing something to fight something he apparently disagrees with.

In any case … respect to Neil for his courage … and may I ask if Mr Young is also going to remove all the videos from YouTube that feature his songs?

Cheers

 

Rui

Another day … another OpenSSL bug

Hello there

There has been a lot of noise around a new bug that OpenSSL has reported, but it seems this is being blown out of proportion (again). The bug was introduced in a commit from late April 2015 (https://git.openssl.org/?p=openssl.git;a=commit;h=6281abc79623419eae6a64768c478272d5d3a426), and the versions it affects have only been around for a month. So far the most used distros seem not to have been affected by this issue:

http://people.canonical.com/~ubuntu-security/cve/2015/CVE-2015-1793.html

https://access.redhat.com/solutions/1523323

 

So … nothing to see here … 😀