Showing only topics in ~comp with the tag "web".
    1. Is it possible to build a sustainable image and video hosting service?

      The history of the web is littered with many a dead image/video hosting service. Echoes of their existence plague older forums in the form of broken links and images. It seems like they all follow the same path: starting up as the new "simple" service that just hosts images, no fuss. But then as interest grows, so do costs, and the service owners have to scramble to monetize. Generally this is done by stuffing the place full of ads until everyone leaves. Alternatively, the owners are stubborn and stick to their guns until they inevitably have to shut down, drowning in costs. When they do shut down, millions of assets are lost and the graveyard of broken images across the web grows some more.

      https://gfycat.com/ is the latest notable victim of this.

      With all the recent social media turmoil, there has been lots of exploration of alternative sites, and all of them have to overcome the problem of hosting media in one way or another.

      Tildes obviously deals with this by avoiding it entirely, which, while a very effective solution, is just handballing the problem elsewhere. Users will still want to post images and videos, but they will have to find alternative hosts. Over time those hosts will die and Tildes posts will be filled with dead links.

      Mastodon has similar problems: the biggest cost of hosting a Mastodon instance is the storage and bandwidth required to serve media posts. And there's a real danger of an instance incurring high costs if a particular post becomes popular and is hotlinked from a big centralised social media site.

      It seems like a really tricky problem to solve. Something peer-to-peer could partly absorb the costs created by traffic peaks, but it struggles when there are many small files each viewed by only a few people.

      Are there any other solutions out there? Web3, IPFS? Or is it just not that much of a problem? Do we accept that media on the web is ephemeral and will be lost after a while?

      80 votes
    2. The ideal backend language to write web apps in 2023?

      I know, quite a controversial and opinionated question, one that might easily get blasted with downvotes on a site like Stack Overflow or even Reddit! Nevertheless, it's one which I believe is still relevant and useful to ask, even in 2023.

      The problem with backend web technologies is that we are overwhelmed with choices. While being spoilt for choice sounds useful, it can easily become an impediment to decision making too. Based on my experience, there are a bunch of useful stacks, and I will work with any of them if you pay me as a freelance coder. Each has its own pros and cons, but I'm yet to find the ideal one, which to me is something that's easy to code and deploy while also performing well.

      • ASP.NET: C# is the language I started coding web apps with in my last company, and ASP.NET Web Forms was quite the rage back then. PHP was also gaining traction in the open source world, and webdev was mostly divided between the enterprisey .NET aristocrats of the Microsoft world and the poor PHP peasants of the FOSS world! One good thing about ASP.NET was performance. Since MS controlled the whole stack, they also put great effort into making it fast. The bad thing, of course, was dependence on a closed tech stack and a closed black box that generated JS functionality on its own.
      • PHP: When I resigned from that company and started freelancing, I came to know about open source, Linux, XAMPP, etc. That was when I realized that my own attitudes and thinking were more attuned to the FOSS peasant mindset than the wealthy aristocrat's! I didn't earn quite as much freelancing with WP, Drupal, SuiteCRM, CodeIgniter, etc., but I found great happiness and contentment in being part of the open source process. To this day, PHP remains my favorite language for backend development, and most of my web projects involve CodeIgniter or even pure PHP.
      • Python: Flask is what got me interested in Python web development. The sheer minimalism and flexibility of that framework are remarkable and quite a rarity in the frameworks world, and the Jinja2 template system is just fantastic (see the small sketch after this list). The other framework, Django, is more popular I think, and I've worked with it too, but Flask still remains my favorite. Flask is good in the performance department too, but I think it gets tricky once you start scaling to many users.
      • Java: I've never really bothered with Java web development except for a few tutorial experiments on the Apache TomEE server. The multi-layered approach that Java takes not only has a very steep learning curve, but unless you're a very gifted programmer, it's practically impossible to beat the performance of interpreted PHP/Flask!
      • NodeJS: Again, not much work here except brief hobby projects like http-live-simulator. The npm packaging system really turned me off initially, with so many packages and so many issues in its earlier days. Nowadays I've heard it's much more usable, but I've never gotten into it.
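
      To illustrate the minimalism mentioned in the Python point above, here's a minimal sketch of a Flask app rendering an inline Jinja2 template. It's just a generic hello-world in the Flask style; the route and template are made up for illustration.

      # Minimal Flask app rendering an inline Jinja2 template.
      # Run it, then visit http://localhost:5000/hello/world
      from flask import Flask, render_template_string

      app = Flask(__name__)

      # Jinja2 template kept inline so the whole example fits in one file.
      HELLO_TEMPLATE = "<h1>Hello, {{ name }}!</h1>"

      @app.route("/hello/<name>")
      def hello(name):
          # render_template_string runs the string through Jinja2.
          return render_template_string(HELLO_TEMPLATE, name=name)

      if __name__ == "__main__":
          app.run(debug=True)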

      And now we also have evolving languages like Golang, Rust, etc. taking their baby steps towards web development too! Are any of them worth giving a try? If someone were to ask you for a backend tech stack recommendation, giving equal weight to performance, developer productivity and ease of deployment, which one would you suggest?

      23 votes
    3. Ffmpeg and AV1 for HTML5 streaming

      I've been looking around online at compatibility for HTML5 browser streaming. It looks like straight-up AV1 in an MP4 container is becoming absolutely fine for browser playback on devices.

      Is anyone using this on webpages yet? The sooner we move to AV1, the sooner we can have high quality video stored at smaller file sizes, which is a massive bonus.

      Right now my company's video hosting is purely MP4 with H.264, moov atom moved to the front as required, and it plays back on everything with no fallback in a straight HTML5 video element. What are the chances of switching to AV1 and not having to worry about a fallback, for the most part?

      Edit: I should have used a better title. I used FFmpeg for MP4 and AV1 creation/encoding. This is more about HTML5 video container code and direct AV1 file playback.

      20 votes
    4. Web developers - What is your stack?

      As someone who is not mainly a web developer, I can barely grasp the immensity of options when it comes to writing a web application.

      So far everything I've written has been using PHP and the Slim microframework. PHP because I don't use languages like Python/Ruby/JS that much so I didn't have any prior knowledge of those, and I've found myself to be fairly productive with it. Slim because I didn't want a full-blown framework with 200 files to configure.

      I've tried Go because I've used it in the past, but I don't find it a great fit for websites. I think it's fine for small microservices, but doing MVC was a chore; maybe there's a framework out there that solves this.

      As for the frontend, I've been trying to use as little JavaScript as possible, always vanilla. As for HTML and CSS, I'm no designer, so I kind of get by copying code and tweaking things here and there.

      However, I've started a slightly bigger project and I don't fancy writing everything from scratch (especially security); besides, ORMs can be useful. Symfony 4 is what I've been using for a couple of days, but I've had trouble setting up debugging, and the community/docs don't seem that great since this version is fairly new, so I'm considering trying something more popular like Django.

      So this is why I created the post. I know this will differ greatly depending on the use case, but I would like to do a quick survey and hear some of your recommendations, both on the backend and the frontend. Besides, I think it's a good topic for discussion.

      Cheers!

      20 votes
    5. 100s of tabs: what is there?

      Those of you who keep hundreds of tabs open: I'm curious how and why you use them. I used to hoard tabs, but after a sad incident in which a browser (Firefox) restart caused the loss of all the tens of open tabs I had accumulated over weeks of research on a topic, I decided never to trust tabs again. Now I'm making use of my bookmarks toolbar, Org mode and Instapaper for most of the stuff that having many tabs open used to cover. So, for me, tabs were for keeping stuff handy during research, read-it-later lists, and temporary bookmarks. What are the use cases for you?

      19 votes
    6. Good resources for accessibility in web design/development?

      Hey there! Any web developers/designers out there that have resources on creating websites that are fully accessible? I am getting back into web development after a decade away and want to learn the correct way. Thanks for any tips!

      16 votes
    7. Firefox 62 Nightlies: Improving DNS Privacy in Firefox

      Firefox recently introduced DNS over HTTPS (DoH) and Trusted Recursive Resolver (TRR) in nightly builds for Firefox 62.

      DoH and TRR are intended to help mitigate these potential privacy and security concerns:

      1. Untrustworthy DNS resolvers tracking your requests, or tampering with responses from DNS servers.
      2. On-path routers tracking or tampering in the same way.
      3. DNS servers tracking your DNS requests.

      DNS over HTTPS (DoH) encrypts DNS requests and responses, protecting against on-path eavesdropping, tracking, and response tampering.

      Trusted Recursive Resolver (TRR) allows Firefox to use a DNS resolver that's different from the one in your machine's network settings. You can use any recursive resolver that is compatible with DoH, but it should be a trusted resolver (one that won't sell users’ data or trick users with spoofed DNS). Mozilla is partnering with Cloudflare (but not using the 1.1.1.1 address) as the initial default TRR; however, it's possible to use another third-party TRR or run your own.

      Cloudflare is providing a recursive resolution service with a pro-user privacy policy. They have committed to throwing away all personally identifiable data after 24 hours, and to never passing that data along to third parties. And there will be regular audits to ensure that data is being cleared as expected.

      Additionally, Cloudflare will be doing QNAME minimization where the DNS resolver no longer sends the full original QNAME (foo.bar.baz.example.com) to the upstream name server. Instead it will only include the label for the zone it's trying to resolve.

      For example, let's assume the DNS resolver is trying to find foo.bar.baz.example.com, and already knows that ns1.nic.example.com is authoritative for .example.com, but does not know a more specific authoritative name server.

      1. It will send the query for just baz.example.com to ns1.nic.example.com which returns the authoritative name server for baz.example.com.
      2. The resolver then sends a query for bar.baz.example.com to the nameserver for baz.example.com, and gets a response with the authoritative nameserver for bar.baz.example.com
      3. Finally the resolver sends the query for foo.bar.baz.example.com to bar.baz.example.com's nameserver.
        In doing this, the full queried name (foo.bar.baz.example.com) is not exposed to intermediate name servers (bar.baz.example.com, baz.example.com, example.com, or even the .com root nameservers).
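
      To make the walkthrough above a bit more concrete, here's a rough, self-contained sketch of the label arithmetic behind QNAME minimization. It makes no real DNS queries; the zone layout simply mirrors the foo.bar.baz.example.com example and is purely illustrative.

      # Shows which name each intermediate nameserver would see while
      # resolving foo.bar.baz.example.com under QNAME minimization,
      # versus the full name it would have seen without it.
      FULL_NAME = "foo.bar.baz.example.com"

      def minimized_queries(full_name, known_zone="example.com"):
          """Yield (queried_name, zone_asked) pairs, adding one label at a time."""
          labels = full_name.split(".")
          zone_labels = known_zone.split(".")
          # Labels not yet revealed to the zone we already know about.
          remaining = labels[: len(labels) - len(zone_labels)]
          current = known_zone
          for label in reversed(remaining):
              queried = label + "." + current
              yield queried, current
              current = queried

      for queried, zone in minimized_queries(FULL_NAME):
          print(f"ask the nameserver for {zone:<24} about {queried}")
          print(f"  (without minimization it would have seen {FULL_NAME})")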

      Collectively, DNS over HTTPS (DoH), Trusted Recursive Resolver (TRR), and QNAME minimization are a step in the right direction, but this does not fix DNS-related data leaks entirely:

      After you do the DNS lookup to find the IP address, you still need to connect to the web server at that address. To do this, you send an initial request. This request includes a server name indication, which says which site on the server you want to connect to. And this request is unencrypted.
      That means that your ISP can still figure out which sites you’re visiting, because it’s right there in the server name indication. Plus, the routers that pass that initial request from your browser to the web server can see that info too.

      So how do I enable it?
      DoH and TRR can be enabled in Firefox 62 or newer by going to about:config:

      • Set network.trr.mode to 2
        • Here are the possible network.trr.mode settings:
          • 0 - Off (default): Use standard native resolving only (don't use TRR at all)
          • 1 - Race: Native vs. TRR. Do them both in parallel and go with the one that returns a result first.
          • 2 - First: Use TRR first, and only if name resolution fails, use the native resolver as a fallback.
          • 3 - Only: Only use TRR. Never use the native (after the initial setup).
          • 4 - Shadow: Runs the TRR resolves in parallel with the native for timing and measurements but uses only the native resolver results.
          • 5 - Off by choice: This is the same as 0 but marks it as done by choice and not done by default.
      • Set network.trr.uri to your DoH server's URL.
      • The DNS Tab on about:networking will show which names were resolved using TRR via DoH.

      Links:
      A cartoon intro to DNS over HTTPS
      Improving DNS Privacy in Firefox
      DNS Query Name Minimization to Improve Privacy
      TRR Preferences

      I'm not affiliated with Mozilla or Firefox, I just thought ~ would find this interesting.

      13 votes
    8. I challenge you to use Epiphany for a week!

      When Edge died, I got worried about losing competition to the Blink engine, and as such I went exploring other alternatives, only to realize... there's not a whole lot: there's Blink, Gecko and WebKit.

      So with that, I decided to try Epiphany, GNOME's web browser. It uses WebKit, which is what Blink was forked from, so it's not terribly different in theory, but the years apart have made the differences more apparent. It's fairly elegant in my opinion, even if it lacks some features, sure.


      Anyway, to get to what I wanted to do this week: I'd like to challenge you all to use it for a week, mostly for bug-hunting purposes and possibly to throw ideas at the project. Worth mentioning: I'm not affiliated with the project, just a user.

      So, to make sure we're all on the same page, we'll use the development Epiphany Flatpak; this way we can be sure that any problem is in the current codebase. So, to install it:

      First, let's install the gnome-nightly repos as per the instructions here:

      flatpak remote-add --if-not-exists gnome-nightly https://sdk.gnome.org/gnome-nightly.flatpakrepo
      flatpak remote-add --if-not-exists gnome-apps-nightly --from https://sdk.gnome.org/gnome-apps-nightly.flatpakrepo
      

      Then, let's install the development version:

      flatpak install org.gnome.Epiphany.Devel
      

      Then just launch it and have fun with it!


      If you run into any bugs, look at the contribution guide here and report them in the repo, after checking that the bug hasn't already been reported, of course!

      12 votes
    9. Batch-saving websites for offline viewing

      Anybody here have a good setup for batch-downloading articles/news from several sites you specify, similar to youtube-dl but for general websites? I'm sure it could be scripted without too much effort, but I'm interested in what polished solutions are out there.

      The idea would be that people with rare internet access could go to a hotspot weekly or so and sync that week's worth of content.
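
      For what it's worth, here's a minimal sketch of the scripted approach, assuming the requests library and a plain text file of URLs (urls.txt and the "offline" folder are made up for illustration). It only saves the raw HTML; grabbing images/CSS and rewriting links is where the polished tools earn their keep.

      # Fetch each URL listed in urls.txt and save the raw HTML locally.
      import pathlib
      import requests

      OUT_DIR = pathlib.Path("offline")
      OUT_DIR.mkdir(exist_ok=True)

      def safe_name(url):
          # Turn a URL into a filesystem-friendly filename.
          return "".join(c if c.isalnum() else "_" for c in url) + ".html"

      with open("urls.txt") as f:
          urls = [line.strip() for line in f if line.strip()]

      for url in urls:
          try:
              resp = requests.get(url, timeout=30)
              resp.raise_for_status()
          except requests.RequestException as exc:
              print(f"skipped {url}: {exc}")
              continue
          (OUT_DIR / safe_name(url)).write_text(resp.text, encoding="utf-8")
          print(f"saved {url}")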

      12 votes
    10. Spiders

      Is anyone here familiar with crawling the web? I’m interested in broad crawling, rather than focusing on particular sites. I’d appreciate pretty much any information about how this is usually done, and things to watch out for if attempting it.
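
      Not an answer to how it's done at scale, but here's a bare-bones sketch of the usual ingredients for polite crawling (a frontier queue, a robots.txt check, a crawl delay), using the requests library and the standard library. The seed URL, user-agent string and page cap are placeholders; real broad crawls add per-domain politeness, deduplication, retries and a proper HTML parser.

      # Bare-bones breadth-first crawler sketch.
      import collections
      import re
      import time
      import urllib.parse
      import urllib.robotparser

      import requests

      USER_AGENT = "example-crawler/0.1"   # placeholder
      SEED = "https://example.com/"        # placeholder
      LINK_RE = re.compile(r'href="(https?://[^"#]+)"')

      robots_cache = {}

      def allowed(url):
          # Cache one robots.txt parser per host; None means "couldn't fetch it".
          host = urllib.parse.urlsplit(url).netloc
          if host not in robots_cache:
              rp = urllib.robotparser.RobotFileParser()
              rp.set_url(f"https://{host}/robots.txt")
              try:
                  rp.read()
                  robots_cache[host] = rp
              except OSError:
                  robots_cache[host] = None   # unreachable: be permissive in this sketch
          rp = robots_cache[host]
          return rp is None or rp.can_fetch(USER_AGENT, url)

      frontier = collections.deque([SEED])
      seen = {SEED}

      while frontier and len(seen) < 50:       # small cap for the sketch
          url = frontier.popleft()
          if not allowed(url):
              continue
          try:
              resp = requests.get(url, headers={"User-Agent": USER_AGENT}, timeout=15)
          except requests.RequestException:
              continue
          print(resp.status_code, url)
          for link in LINK_RE.findall(resp.text):
              if link not in seen:
                  seen.add(link)
                  frontier.append(link)
          time.sleep(1)                        # crude politeness delay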

      10 votes
    11. How can I push/inspire myself to learn JavaScript and Node?

      I'm a full-stack dev, and my current use of the JavaScript language is limited to making the best (read: trivial) use of jQuery for DOM manipulation in my web apps, which primarily use PHP or Python (Flask/Django) as the backend.

      Now, have you watched that popular thriller movie Inception, which is based on the radical sci-fi concept that an idea or thought can be implanted into someone's mind remotely?

      Just like that, I often find myself facing this strange idea that JavaScript is supposed to be just a toy running inside the web browser. No idea where exactly this idea came from! Each time I try to learn JS or think of getting deeper into things like npm, React, etc., this idea just pops up and kinda stops me from doing anything!

      Is there any way to get rid of this idea somehow?

      9 votes
    12. Node's "Single Threaded, Event Driven" programming model seems highly deceptive and farcical

      The more I think about it, the more I'm convinced of it.

      The biggest selling point of Node folks has been the "single threaded, event driven" model, right? Unlike JavaScript, other languages work on a "blocking" basis, i.e. you run a statement or command and the program "waits" until the I/O is complete. For example, you issue open('xyz.txt', 'rb').read() in Python and the program waits or blocks until the underlying driver is able to read that whole text file (which could take an arguably long time if the file is very large).

      But with the Nodejs equivalent, you just issue the statement and then pass the "event handler" so that your program is never in the "waiting state". The whole premise of Node/JS event-callback is that "you don't call us, we will call you".

      This is all nice in theory, but if it were indeed true, then Node.js scripts should be blazing fast compared to Python and even Java, considering that most programs we write are I/O heavy and 99% of the time they're just waiting for input from a file/URI/user. If this event-callback model really worked as effectively as claimed, wouldn't Node be the number one and only language in use today?

      I think I'm starting to understand why that isn't the case. This whole "single threaded, event driven" thing is just a farce. You can replicate the same thing that Node.js is doing in your Java or Python too by applying multi-threading (i.e. one thread just "waits" for the I/O in the background while the other keeps doing its job). All you've done is hand off or delegate the complexity of multi-threading to Node.js.

      Realistically, it's impossible to wait on or block an I/O request while at the same time letting the rest of the code engage in other tasks; that's the very definition of multi-threading. Doing "async" is impossible without multiple threads in that sense. Node must have a thread pool of sorts, where one thread is engaged in the wait/block while another keeps running your JS code. When the wait is over, control is then passed to the "event handler" function it was bound to, on that other thread.
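
      To make that comparison concrete, here's a rough Python sketch of the pattern described above: the blocking read happens in a worker thread and a callback fires when it finishes, while the main thread keeps doing other work. It's only an illustration of the argument, not a claim about Node's actual internals; xyz.txt is the hypothetical file from the post.

      # "Blocking I/O in a background thread + callback", Node-style.
      import threading
      import time

      def read_file_async(path, callback):
          def worker():
              with open(path, "rb") as f:
                  data = f.read()        # this worker thread blocks here
              callback(data)             # "you don't call us, we will call you"
          threading.Thread(target=worker, daemon=True).start()

      def on_done(data):
          print(f"callback: read {len(data)} bytes")

      read_file_async("xyz.txt", on_done)

      # Meanwhile the "main" code keeps going instead of waiting.
      for i in range(3):
          print("main thread doing other work", i)
          time.sleep(0.1)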

      What Node is selling as "single threaded" applies to the application or business logic we are writing; Node itself can't be single threaded. I feel it's better to just implement multi-threading in your own code (as needed) instead of using something convoluted and confusing like Node.js. What say you?

      8 votes