Cloud Servers for the Broke
Just wanted to put this out there as a little PSA in case it's helpful: if you want a cloud server but don't wanna pay anything, Oracle's Free Tier is a life saver. Discovered it a year ago and couldn't be happier I did, since I'd never pay for cloud computing otherwise.
Quick Specs:
For free you get:
- 24/7 uptime
- 200 GB of storage space
- 24GB of RAM
- 4 OCPUs
- 4 Gbps Bandwidth
That's been more than enough for me and honestly feels too good to be true. Some things I've done with this:
- Minecraft Server
- Radarr/Sonarr Plex Setup
- I'd like to make my own backup solution as well!
If anyone has any other ideas for cool projects I could self-host, please do tell; I'm curious what else I could do :)
48 votes -
Windows could become cloud based in the future
16 votes -
Apple tests "Apple GPT," develops generative AI tools to catch OpenAI
17 votes -
Why do cloud providers keep building datacenters in America's hottest city?
33 votes -
InfluxDB has apparently shut down - and deleted! - two of its data centers and some customers did not get any warning
23 votes -
Cloud Native Software Engineering
3 votes -
Microsoft wants to move Windows fully to the cloud
72 votes -
Microsoft's $68.7bn (£55bn) deal to buy US video game company Activision Blizzard has been blocked in the UK by the Competition and Markets Authority
13 votes -
A gift from the Stadia team & Bluetooth controller functionality info
14 votes -
Anker's Eufy lied to us about the security of its security cameras. Despite claims of only using local storage, Eufy has been uploading identifiable footage to the cloud.
18 votes -
Stadia is shutting down
38 votes -
Stack Overflow trends: Weekday vs weekend site activity
5 votes -
A dad took photos of his naked toddler for the doctor. Google flagged him as a criminal.
14 votes -
Broadcom announces plans to buy VMware in $61 billion deal
16 votes -
All-new PlayStation Plus tiers launch in June
4 votes -
Analysis by computer science professor shows that "Google Phone" and "Google Messages" send data to Google servers without being asked and without the user's knowledge, continuously
11 votes -
Google Stadia has reportedly been demoted
21 votes -
How I got pwned by my cloud costs
14 votes -
A look back at Q3 '21 public cloud software earnings
3 votes -
PlayStation plans new service to take on Xbox Game Pass
5 votes -
AWS embraces Fedora Linux for its cloud-based Amazon Linux
5 votes -
Tell your hopes and experiences with cloud gaming
So I just upgraded to an M1 Mac Mini. I was a little iffy on it; part of me wanted to build a PC just to play games, but I really like macOS, and I mostly play on PS5 and the Switch, with the PC only being for indie titles and things that only work with a keyboard and mouse, like RTS, 4X, or city builders. I just don't play PC games enough to prioritize gaming as a use case when buying a computer, but I also really like RTS and city-builder games.
I figured WINE and Parallels would meet most of my gaming needs, but my forays into WINE have been frustrating and buggy, and this Reddit thread about what works on Parallels is, frankly, just kind of sad to look at. What's worse, apparently the new Age of Empires has some kind of pathfinding instruction set that ONLY works on x86 architecture, so it won't work under any kind of virtualization or emulation.
Enter cloud gaming. It seems the big contenders right now are ShadowPC, GeForce Now, and Paperspace. Has anyone tried these? When I last costed these out, Shadow was only around $15-$20 a month, which was almost a no-brainer. But it seems to have gone up to $30 a month now, which gets costly enough that it almost seems like I'd rather get a Steam Deck. Paperspace is about $10 per month plus another ~$1 per hour of play, which would probably end up cheapest for how little I play. But I have no idea how it is in terms of configuration and latency.
7 votes -
GeForce Now cloud gaming service adds new RTX 3080 membership tier, supporting streaming at up to 1440p and 120 FPS
10 votes -
BlueStacks X is a new and free way to play Android games in your browser
8 votes -
Ighor July "unlocks" GeForce Now
9 votes -
As it turns out, "Netflix Gaming" isn't a streaming service
7 votes -
Xbox and Xbox Game Pass are coming to more screens
7 votes -
Do you use game streaming services? Which ones and why or why not?
I wanted to get a general discussion going on opinions of game streaming services. This is a potentially huge market, and the big companies out there are really trying to break into it. I personally use Google Stadia and love it; there is a slight amount of latency in movements, but it feels no different than a larger dead zone to me.
I love the idea of game streaming as it brings more games to more platforms like Linux, macOS, and mobile devices. I know the big argument against them is that you don't own the games, but from my perspective, you don't own the games on Steam either; you own the right to play someone else's game, just like with Google Stadia, Luna, or xCloud. If you want to own an actual copy, then you have to buy the game from a vendor like GOG or itch.io.
So let me know your opinions on this market: do you think it's good, bad, or somewhere in between, and why? If you play on any of these services, what are your thoughts and experiences? Has it worked well for you, and do you see yourself using services like this in the future? I'm genuinely curious, as it's a completely different mindset than what we're used to, and it could really disrupt a market that hasn't seen proper innovation in years.
13 votes -
LittleBigPlanet has been near-unplayable for a long time, and no one's said why
14 votes -
Finnish telecoms giant Nokia is to axe between 5,000 and 10,000 jobs worldwide in the next two years as it cuts costs
7 votes -
Stadia developers can't fix the bugs in their own game because Google fired them
13 votes -
Microsoft xCloud for Web - First look
3 votes -
Google Stadia shuts down internal studios, changing business focus
24 votes -
Stadia offering a free Premiere Edition bundle for US YouTube Premium members
7 votes -
In which a foolish developer tries DevOps: critique my VPS provisioning script!
I'm attempting to provision two mirrored staging and production environments for a future SaaS application that we're close to launching as a company, and I'd like to get some feedback on the provisioning script I've created. It takes a default VPS from our hosting provider, DigitalOcean, and readies it to be a secure hosting environment for our application instance (which runs inside Docker and persists data to an unrelated managed database).
I'm sticking with a simple infrastructure architecture at the moment: a single VPS which runs both nginx and the application instance inside a containerised Docker service as mentioned earlier. There are no load balancers or server duplication at this point. @Emerald_Knight very kindly provided me in the Tildes Discord with some overall guidance about what to aim for when configuring a server (limit damage as best as possible, limit access when an attack occurs), so I've tried to be thoughtful and integrate that paradigm where possible (disabling root login, etc.).
I'm not a DevOps or sysadmin-oriented person by trade (I stick to programming most of the time), but this role falls to me as the technical person in this business, so the last few days have been a lot of reading and readying. I'll run through the provisioning flow step by step. Oh, and for reference: Ubuntu 20.04 LTS.
First step is self-explanatory.
```
#!/bin/sh

# Name of the user to create and grant privileges to.
USERNAME_OF_ACCOUNT=

sudo apt-get -qq update
sudo apt install -qq --yes nginx
sudo systemctl restart nginx
```
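Since nginx is going to sit in front of the Docker-hosted application, a reverse-proxy server block will need to be dropped in at some point too. A minimal sketch of what that could look like; the domain (example.com) and upstream port (8080) are placeholders I've assumed, not values from the original post:

```
# Illustrative only: example.com and port 8080 are placeholder values.
cat <<'EOF' | sudo tee /etc/nginx/sites-available/app.conf
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
sudo ln -s /etc/nginx/sites-available/app.conf /etc/nginx/sites-enabled/app.conf
sudo nginx -t && sudo systemctl reload nginx
```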
Next, create my sudo user, add them to the groups needed, require a password change on first login, then copy across any provided authorised keys from the root user, which you can configure to be seeded to the VPS in the DigitalOcean management console.
```
useradd --create-home --shell "/bin/bash" --groups sudo,www-data "${USERNAME_OF_ACCOUNT}"
passwd --delete $USERNAME_OF_ACCOUNT
chage --lastday 0 $USERNAME_OF_ACCOUNT

HOME_DIR="$(eval echo ~${USERNAME_OF_ACCOUNT})"

mkdir --parents "${HOME_DIR}/.ssh"
cp /root/.ssh/authorized_keys "${HOME_DIR}/.ssh"

# Lock down the new user's SSH directory and keys (not root's).
chmod 700 "${HOME_DIR}/.ssh"
chmod 600 "${HOME_DIR}/.ssh/authorized_keys"
chown --recursive "${USERNAME_OF_ACCOUNT}":"${USERNAME_OF_ACCOUNT}" "${HOME_DIR}/.ssh"

sudo chmod 775 -R /var/www
sudo chown -R $USERNAME_OF_ACCOUNT /var/www
rm -rf /var/www/html
```
Install Docker and run it as a service, ensuring the created user is added to the docker group.
```
sudo apt-get install -qq --yes \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg-agent \
    software-properties-common

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo apt-key fingerprint 0EBFCD88
sudo add-apt-repository --yes \
    "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
    $(lsb_release -cs) \
    stable"

sudo apt-get -qq update
sudo apt install -qq --yes docker-ce docker-ce-cli containerd.io

# Only add a group if it does not exist
sudo getent group docker || sudo groupadd docker
sudo usermod -aG docker $USERNAME_OF_ACCOUNT

# Enable docker
sudo systemctl enable docker

sudo curl -L "https://github.com/docker/compose/releases/download/1.27.4/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
sudo ln -s /usr/local/bin/docker-compose /usr/bin/docker-compose
docker-compose --version
```
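Not in the original script, but a cheap sanity check at this point is to confirm the daemon actually runs containers before moving on:

```
# Optional sanity check (my addition): confirm the Docker daemon works.
sudo systemctl start docker
sudo docker run --rm hello-world
```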
Disable root logins and any form of password-based authentication by altering `sshd_config`.

```
sed -i '/^PermitRootLogin/s/yes/no/' /etc/ssh/sshd_config
sed -i '/^PasswordAuthentication/s/yes/no/' /etc/ssh/sshd_config
sed -i '/^ChallengeResponseAuthentication/s/yes/no/' /etc/ssh/sshd_config
```
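One thing I'd consider adding here (my suggestion, not part of the original script): validate the edited file before the daemon gets restarted at the end, so a bad edit can't lock you out of SSH.

```
# Suggested addition: abort the provisioning run if sshd_config is now invalid.
sudo sshd -t || { echo "sshd_config is invalid, aborting"; exit 1; }
```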
Configure the firewall and fail2ban.
```
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh
sudo ufw allow http
sudo ufw allow https
sudo ufw reload
sudo ufw --force enable && sudo ufw status verbose

sudo apt-get -qq install --yes fail2ban
sudo systemctl enable fail2ban
sudo systemctl start fail2ban
```
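I believe the Debian/Ubuntu packaging enables an sshd jail out of the box, but if you want to tune it explicitly, a minimal `/etc/fail2ban/jail.local` might look something like this (the values are illustrative, not from the original post):

```
# Illustrative values only -- tune to taste.
cat <<'EOF' | sudo tee /etc/fail2ban/jail.local
[DEFAULT]
bantime  = 1h
findtime = 10m
maxretry = 5

[sshd]
enabled = true
EOF
sudo systemctl restart fail2ban
```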
Swapfiles.
```
sudo fallocate -l 1G /swapfile && ls -lh /swapfile
sudo chmod 0600 /swapfile && ls -lh /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile && sudo swapon --show
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```
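If the swapfile is mostly there as out-of-memory insurance rather than something you expect to lean on routinely, it may also be worth lowering swappiness. This is my addition, not part of the original script:

```
# Optional: prefer keeping application memory resident over swapping early.
echo 'vm.swappiness=10' | sudo tee /etc/sysctl.d/99-swappiness.conf
sudo sysctl --system
```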
Unattended updates, and restart the ssh daemon.
```
sudo apt install -qq --yes unattended-upgrades
sudo systemctl restart ssh
```
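One caveat I'd flag as a suggestion (not from the original post): installing the package doesn't by itself guarantee the periodic runs are switched on. On Ubuntu that's controlled by `/etc/apt/apt.conf.d/20auto-upgrades`, which can be written directly or generated with `dpkg-reconfigure`:

```
# Ensure unattended-upgrades actually runs periodically (suggested addition).
cat <<'EOF' | sudo tee /etc/apt/apt.conf.d/20auto-upgrades
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
EOF
```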
Some questions
You can assume these questions are cost-benefit focused, i.e. is it worth my time to investigate this, versus something else that may have better gains given my limited time.
- Obviously, any critiques of the above provisioning process are appreciated, both on the micro level of criticising particular lines, and zooming out and saying "well, why don't you do this instead...". I can't know what I don't know.
- Is it worth investigating tools such as `ss` or `lynis` (https://github.com/CISOfy/lynis) to perform server auditing? I don't have to meet any compliance requirements at this point.
- Do I get any meaningful increase in security by implementing 2FA on login here using Google Authenticator (rough sketch after this list)? As far as I can see, as long as I'm using best practices to actually `ssh` into our boxes, then the likeliest risk profile for unwanted access probably isn't via the authentication mechanism I use personally to access my servers.
- Am I missing anything here? Beyond the provisioning script itself, I adhere to best practices around storing and generating passwords and ssh keys.
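For what it's worth, on the 2FA question: the usual route on Ubuntu is the `libpam-google-authenticator` PAM module. A rough sketch of how it could slot into this script follows; note it deliberately re-enables challenge-response authentication, which the earlier sed turned off, and all of it is my assumption rather than anything from the original post.

```
# Rough sketch only -- verify against current Ubuntu/OpenSSH docs before use.
sudo apt install -qq --yes libpam-google-authenticator

# Require a TOTP code in addition to the public key.
echo "auth required pam_google_authenticator.so" | sudo tee -a /etc/pam.d/sshd
sudo sed -i 's/^ChallengeResponseAuthentication no/ChallengeResponseAuthentication yes/' /etc/ssh/sshd_config
echo "AuthenticationMethods publickey,keyboard-interactive" | sudo tee -a /etc/ssh/sshd_config

# Each user then runs `google-authenticator` once to generate their secret.
```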
Some notes and comments
- Eventually I'll use the hosting provider's API to spin up and spin down VPSs on the fly via a custom management application, which gives me an opportunity to programmatically execute the provisioning script above and run some other pre- and post-provisioning things, like deployment of the application and so forth.
- Usage alerts and monitoring are configured within DigitalOcean's console, and alerts are sent to our business's Slack for me to action as needed. Currently, I'm settling on the following alerts:
- Server CPU utilisation greater than 80% for 5 minutes.
- Server memory usage greater than 80% for 5 minutes.
- I'm also looking at setting up daily fail2ban status alerts if needed (rough sketch below).
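If you do end up wanting those fail2ban digests, a small cron-driven script posting to a Slack incoming webhook would probably do. This is only a sketch of one way to do it, not something from the original post; the webhook URL is a placeholder and it assumes `jq` is installed for JSON escaping.

```
#!/bin/sh
# Rough sketch: post the sshd jail status to Slack once a day (run as root via cron).
# SLACK_WEBHOOK_URL is a placeholder for your incoming-webhook URL.
SLACK_WEBHOOK_URL="https://hooks.slack.com/services/XXX/YYY/ZZZ"

STATUS="$(fail2ban-client status sshd)"
jq -n --arg text "$STATUS" '{text: $text}' \
    | curl -s -X POST -H 'Content-type: application/json' --data @- "$SLACK_WEBHOOK_URL"
```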
9 votes -
PAC-MAN Mega Tunnel Battle demo - Google Stadia
5 votes -
IBM to break up 109-year old company to focus on cloud growth
18 votes -
EU shoots for €10B "industrial cloud" to rival US
7 votes -
Cloud gaming's history of false starts and promising reboots
5 votes -
Microsoft is bringing xCloud to iOS via the web
5 votes -
The new Chromecast with Google TV won't officially support Stadia at launch
5 votes -
Amazon announces Luna cloud gaming service
6 votes -
App Store review guidelines on streaming games
12 votes -
Geforce NOW Beta on Chromebook - play.geforcenow.com
6 votes -
Five ways cloud-native application testing is different from testing on-premises software
4 votes -
Apple won't allow game streaming services like xCloud and Stadia into the App Store
20 votes -
Neocortix Announces Arm 64-bit Support for Folding@home and Rosetta@home COVID-19 Vaccine Research
4 votes -
7 Aspects of IT Certifications
1 vote -
Microsoft's decision to bundle xCloud as part of Game Pass Ultimate shows how game streaming's role could be a complement instead of competition
3 votes