-
10 votes -
BlueStacks X is a new and free way to play Android games in your browser
8 votes -
Ighor July "unlocks" GeForce Now
9 votes -
As it turns out, “Netflix Gaming” isn’t a streaming service
7 votes -
Xbox and Xbox Game Pass are coming to more screens
7 votes -
Do you use game streaming services? Which ones and why or why not?
I wanted to start a general discussion about opinions on game streaming services. This is a potentially huge market, and the big companies out there are really trying to break into it. I personally use Google Stadia and love it; there is a slight amount of latency in movements, but to me it feels no different than a larger dead zone.
I love the idea of game streaming, as it brings more games to more platforms like Linux, macOS, and mobile devices. I know the big argument against these services is that you don’t own the games, but from my perspective, you don’t own the games on Steam either: you own the right to play someone else’s game, just as with Google Stadia, Luna, or xCloud. If you want to own an actual copy, then you have to buy the game from a vendor like GOG or itch.io.
So let me know your opinions on this market: do you think it’s good, bad, or somewhere in between, and why? If you play on any of these services, what are your thoughts and experiences? Has it worked well for you, and do you see yourself using services like this in the future? I’m genuinely curious, as it’s a completely different mindset than what we’re used to, and it could really disrupt a market that hasn’t seen proper innovation in years.
13 votes -
LittleBigPlanet has been near-unplayable for a long time, and no one's said why
14 votes -
Finnish telecoms giant Nokia is to axe between 5,000 and 10,000 jobs worldwide in the next two years as it cuts costs
7 votes -
Stadia developers can't fix the bugs in their own game because Google fired them
13 votes -
Microsoft xCloud for Web - First look
3 votes -
Google Stadia shuts down internal studios, changing business focus
24 votes -
Stadia offering a free Premiere Edition bundle for US YouTube Premium members
7 votes -
In which a foolish developer tries DevOps: critique my VPS provisioning script!
I'm attempting to provision two mirrored staging and production environments for a future SaaS application that we're close to launching as a company, and I'd like to get some feedback on the provisioning script I've created. It takes a default VPS from our hosting provider, DigitalOcean, and readies it as a secure hosting environment for our application instance (which runs inside Docker and persists data to an unrelated managed database).
I'm sticking with a simple infrastructure architecture at the moment: a single VPS which runs both nginx and the application instance inside a containerised Docker service, as mentioned earlier. There are no load balancers or server duplication at this point. @Emerald_Knight very kindly provided me, in the Tildes Discord, with some overall guidance about what to aim for when configuring a server (limit damage as best as possible, limit access when an attack occurs)—so I've tried to be thoughtful and integrate that paradigm where possible (disabling root login, etc.).
I’m not a DevOps or sysadmin-oriented person by trade—I stick to programming most of the time—but this role falls to me as the technical person in this business, so the last few days have been a lot of reading and readying. I’ll run through the provisioning flow step by step. Oh, and for reference: Ubuntu 20.04 LTS.
First step is self-explanatory.
```sh
#!/bin/sh

# Name of the user to create and grant privileges to.
USERNAME_OF_ACCOUNT=

sudo apt-get -qq update
sudo apt install -qq --yes nginx
sudo systemctl restart nginx
```
Next, create my sudo user, add them to the needed groups, require a password change on first login, and then copy across any provided authorised keys from the root user (which you can configure to be seeded to the VPS in the DigitalOcean management console).
```sh
useradd --create-home --shell "/bin/bash" --groups sudo,www-data "${USERNAME_OF_ACCOUNT}"
passwd --delete $USERNAME_OF_ACCOUNT
chage --lastday 0 $USERNAME_OF_ACCOUNT

HOME_DIR="$(eval echo ~${USERNAME_OF_ACCOUNT})"

mkdir --parents "${HOME_DIR}/.ssh"
cp /root/.ssh/authorized_keys "${HOME_DIR}/.ssh"
# Note: the original used ~/.ssh here, which resolves to root's home when the
# script runs as root; the new user's .ssh directory is the intended target.
chmod 700 "${HOME_DIR}/.ssh"
chmod 600 "${HOME_DIR}/.ssh/authorized_keys"
chown --recursive "${USERNAME_OF_ACCOUNT}":"${USERNAME_OF_ACCOUNT}" "${HOME_DIR}/.ssh"

sudo chmod 775 -R /var/www
sudo chown -R $USERNAME_OF_ACCOUNT /var/www
rm -rf /var/www/html
```
Install Docker and run it as a service, ensuring the created user is added to the docker group.
```sh
sudo apt-get install -qq --yes \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg-agent \
    software-properties-common

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
sudo apt-key fingerprint 0EBFCD88
sudo add-apt-repository --yes \
    "deb [arch=amd64] https://download.docker.com/linux/ubuntu \
    $(lsb_release -cs) \
    stable"
sudo apt-get -qq update
sudo apt install -qq --yes docker-ce docker-ce-cli containerd.io

# Only add a group if it does not exist
sudo getent group docker || sudo groupadd docker
sudo usermod -aG docker $USERNAME_OF_ACCOUNT

# Enable docker
sudo systemctl enable docker

sudo curl -L "https://github.com/docker/compose/releases/download/1.27.4/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
sudo ln -s /usr/local/bin/docker-compose /usr/bin/docker-compose
docker-compose --version
```
Disable root logins and any form of password-based authentication by altering `sshd_config`.

```sh
sed -i '/^PermitRootLogin/s/yes/no/' /etc/ssh/sshd_config
sed -i '/^PasswordAuthentication/s/yes/no/' /etc/ssh/sshd_config
sed -i '/^ChallengeResponseAuthentication/s/yes/no/' /etc/ssh/sshd_config
```
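One caveat worth noting: on a stock Ubuntu 20.04 `sshd_config`, some of these directives ship commented out (e.g. `#PermitRootLogin prohibit-password`), so an address like `/^PermitRootLogin/` never matches and the `sed` silently does nothing. Below is a sketch of a pattern that also matches the commented form; the `harden_sshd_config` function name is my own, and it takes a file path so it can be tried on a copy first.

```sh
# Rewrites PermitRootLogin / PasswordAuthentication /
# ChallengeResponseAuthentication to "no", whether or not the
# directive is currently commented out.
harden_sshd_config() {
  # $1: path to an sshd_config (try it on a copy before /etc/ssh/sshd_config)
  sed -i -E 's/^#?[[:space:]]*(PermitRootLogin)[[:space:]].*/\1 no/' "$1"
  sed -i -E 's/^#?[[:space:]]*(PasswordAuthentication)[[:space:]].*/\1 no/' "$1"
  sed -i -E 's/^#?[[:space:]]*(ChallengeResponseAuthentication)[[:space:]].*/\1 no/' "$1"
}
```

Running `sshd -t` afterwards validates the resulting configuration before restarting the daemon.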
Configure the firewall and fail2ban.
```sh
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow ssh
sudo ufw allow http
sudo ufw allow https
sudo ufw reload
sudo ufw --force enable && sudo ufw status verbose

sudo apt-get -qq install --yes fail2ban
sudo systemctl enable fail2ban
sudo systemctl start fail2ban
```
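On Ubuntu, fail2ban ships with an `sshd` jail, and its defaults can be tuned without touching `jail.conf` (which package upgrades may overwrite) by dropping overrides into `jail.local`. A sketch with example values (the ban and retry numbers here are assumptions, not recommendations):

```sh
# Override fail2ban defaults via jail.local; jail.conf is left untouched.
sudo tee /etc/fail2ban/jail.local > /dev/null <<'EOF'
[DEFAULT]
bantime  = 1h
findtime = 10m
maxretry = 5

[sshd]
enabled = true
EOF
sudo systemctl restart fail2ban
```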
Swapfiles.
```sh
sudo fallocate -l 1G /swapfile && ls -lh /swapfile
sudo chmod 0600 /swapfile && ls -lh /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile && sudo swapon --show
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab
```
Unattended updates, and restart the ssh daemon.
```sh
sudo apt install -qq unattended-upgrades
sudo systemctl restart ssh
```
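One thing worth double-checking after installing `unattended-upgrades` is whether APT's periodic jobs are actually enabled on the box. On Ubuntu this lives in `/etc/apt/apt.conf.d/20auto-upgrades`; the snippet below mirrors what `dpkg-reconfigure --priority=low unattended-upgrades` would write, shown as a sketch to verify against your image's defaults:

```sh
sudo tee /etc/apt/apt.conf.d/20auto-upgrades > /dev/null <<'EOF'
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
EOF
```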
Some questions
You can assume these questions are cost-benefit focused, i.e. is it worth my time to investigate this, versus something else that may have better gains given my limited time.
- Obviously, any critiques of the above provisioning process are appreciated—both on the micro level of criticising particular lines, or zooming out and saying “well why don’t you do this instead…”. I can’t know what I don’t know.
- Is it worth investigating tools such as `ss` or `lynis` (https://github.com/CISOfy/lynis) to perform server auditing? I don’t have to meet any compliance requirements at this point.
- Do I get any meaningful increase in security by implementing 2FA on login here using Google Authenticator? As far as I can see, as long as I’m using best practices to actually `ssh` into our boxes, then the likeliest risk profile for unwanted access probably isn’t via the authentication mechanism I use personally to access my servers.
- Am I missing anything here? Beyond the provisioning script itself, I adhere to best practices around storing and generating passwords and SSH keys.
Some notes and comments
- Eventually I'll use the hosting provider's API to spin up and spin down VPSs on the fly via a custom management application, which gives me an opportunity to programmatically execute the provisioning script above and run some other pre- and post-provisioning tasks, like deployment of the application and so forth.
- Usage alerts and monitoring are configured within DigitalOcean's console, and alerts are sent to our business' Slack for me to action as needed. Currently, I’m settling on the following alerts:
- Server CPU utilisation greater than 80% for 5 minutes.
- Server memory usage greater than 80% for 5 minutes.
- I’m also looking at setting up daily fail2ban status alerts if needed.
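For the API-driven provisioning mentioned in the notes, the DigitalOcean v2 API creates droplets via a single authenticated POST, and its `user_data` field accepts a cloud-init script, which is a natural place to run the provisioning steps on first boot. A hedged sketch follows: the `make_droplet_payload` helper is my own, and the token, name, region, size, and image slugs are placeholders.

```sh
# Build a minimal droplet-create payload for the DigitalOcean v2 API.
# user_data is delivered to cloud-init and runs on the droplet's first boot.
make_droplet_payload() {
  # $1: droplet name
  printf '{"name":"%s","region":"lon1","size":"s-1vcpu-1gb","image":"ubuntu-20-04-x64","user_data":"#!/bin/sh\\n# provisioning script here\\n"}' "$1"
}

# Actual call (requires a real API token in DO_TOKEN):
# curl -X POST "https://api.digitalocean.com/v2/droplets" \
#   -H "Authorization: Bearer ${DO_TOKEN}" \
#   -H "Content-Type: application/json" \
#   -d "$(make_droplet_payload staging-01)"
```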
9 votes -
PAC-MAN Mega Tunnel Battle demo - Google Stadia
5 votes -
IBM to break up 109-year old company to focus on cloud growth
18 votes -
EU shoots for €10B ‘industrial cloud’ to rival US
7 votes -
Cloud gaming’s history of false starts and promising reboots
5 votes -
Microsoft is bringing xCloud to iOS via the web
5 votes -
The new Chromecast with Google TV won’t officially support Stadia at launch
5 votes -
Amazon announces Luna cloud gaming service
6 votes -
App Store review guidelines on streaming games
12 votes -
Geforce NOW Beta on Chromebook - play.geforcenow.com
6 votes -
Five ways cloud-native application testing is different from testing on-premises software
4 votes -
Apple won't allow game streaming services like xCloud and Stadia into the App Store
20 votes -
Neocortix Announces Arm 64-bit Support for Folding@home and Rosetta@home COVID-19 Vaccine Research
4 votes -
7 Aspects of IT Certifications
1 vote -
Microsoft's decision to bundle xCloud as part of Game Pass Ultimate shows how game streaming's role could be a complement instead of competition
3 votes -
Amazon and Google are in games for the wrong reasons
10 votes -
Razer’s Kishi turns your phone into a Nintendo Switch lookalike that can play Google Stadia
5 votes -
How Sega hopes to use Japanese arcades as streaming data centers
5 votes -
Steam Cloud Play (Beta) appears in partner documentation
8 votes -
DigitalOcean introduces Virtual Private Cloud (VPC) networks
5 votes -
Oracle wins cloud computing deal with Zoom as video calls surge
8 votes -
Stadia version of Doom Eternal's lag re-tested, plus tests of The Division 2, Borderlands 3, Ghost Recon Breakpoint, and more
5 votes -
Games from Warner Bros. Interactive Entertainment, XBOX Game Studios, Codemasters and Klei Entertainment will be removed from GeForce Now on April 24
7 votes -
A two-month free trial of Stadia Pro with nine games is now available
7 votes -
Microsoft: Cloud services demand up 775 percent; prioritization rules in place
4 votes -
Google now giving away three months of Stadia access to new Chromecast buyers
9 votes -
Google Stadia announces five upcoming games, including three "First on Stadia" titles
8 votes -
Nvidia's GeForce Now streaming service leaves beta - Uses many existing game purchases, supports ray-tracing, and has a time-limited free trial
12 votes -
Google Stadia announces plans to add over 120 games this year, including over ten exclusives
17 votes -
DigitalOcean is laying off staff, sources say thirty to fifty affected
10 votes -
Google leadership set 2023 as deadline to beat Amazon and Microsoft in the cloud business
6 votes -
Prime leverage: How Amazon wields power in the technology world
5 votes -
"Randomizers" are breathing new life into old games
18 votes -
Google Stadia - 4K image quality analysis and latency tests
11 votes -
CNET reports Amazon is working on a game streaming competitor to Google Stadia
7 votes -
Steam Remote Play Together is now in beta - A new feature that lets you play your couch co-op games with friends over the internet
19 votes -
Digital Foundry's Google Stadia tech review: The best game streaming yet, but far from ready
8 votes -
Google Stadia will be missing many features for Monday’s launch
9 votes