Welcome to David's Space

This is my remarkably inventive space, created as part of the course 1DV022 – Klientbaserad webbprogrammering (Client-side web programming). The purpose of this site is to learn how to set up a basic static web site using an SSG (Static Site Generator), as well as how to use a CSS pre-processor – in this case Sass.

According to the assignment specifications, you will find some information about me, as well as a simple blog implementation containing some reflections on SSGs, CSS pre-processors, and so on.

Finally, the site may in the future be used to display other exercises and assignments from this or other courses. We’ll have to wait and see about that.

Regards,
David.

Posts

  • Open Graph in Jekyll

    “The Open Graph protocol enables any web page to become a rich object in a social graph.” (Open Graph, 2019) In practice, this means Facebook, Slack, Twitter and other platforms may fetch images and descriptions, displaying your link as a “card” rather than as a plain text link, which in turn makes for a friendlier presentation and typically more clicks. (A minimal sketch of such tags appears after this list.)

  • Implementing Disqus Comments in Jekyll

    Jekyll is a framework for generating static pages. User input and comments are not static content, so what to do? Well, we may still embed dynamic content, and that is what the assignment asked us to do…

  • Using CSS Preprocessors

    I was very comfortable with HTML and CSS back in the day, and had a significant interest in “working smart – not hard”. I read up on different resources, like the WordPress style guide, Object Oriented CSS and Inuit CSS.

  • On Static Site Generators (SSGs)

    This site is implemented with Jekyll, an SSG, or Static Site Generator. SSGs are frameworks for simplifying the development and maintenance of static web sites through techniques like includes, variables and layouts; in other words, “Don’t Repeat Yourself”. (See the include sketch after this list.)

  • A Human Touch

    humans.txt is an entirely optional, simple text file listing the humans involved with a web site. It is a simple way of providing a “human touch” to any site, and of giving credit where credit is due.

  • To my robots…

    robots.txt is a simple text file placed in the web server root, providing web crawlers with instructions which they may abide by or disregard. Its main purpose is to keep crawlers from indexing undesirable areas of the web server, and to ensure proper listings in search engines.
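
To illustrate the Open Graph post above, here is a minimal sketch of what such tags could look like in the head of a Jekyll layout. The Liquid variables and front-matter fields used here (page.title, page.excerpt, page.image) are assumptions chosen for the example, not necessarily what this site actually uses.

    <!-- Sketch: Open Graph meta tags in the head of a Jekyll layout.
         page.image is an assumed front-matter field; the other values
         come from standard Jekyll page variables and Liquid filters. -->
    <meta property="og:title" content="{{ page.title }}" />
    <meta property="og:type" content="article" />
    <meta property="og:url" content="{{ page.url | absolute_url }}" />
    <meta property="og:description" content="{{ page.excerpt | strip_html | strip_newlines }}" />
    <meta property="og:image" content="{{ page.image | absolute_url }}" />

With tags like these in place, a platform that fetches the link can build the “card” described above instead of showing a bare URL.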
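
Likewise, for the SSG post, a small sketch of the “includes” technique that keeps a static site DRY: a shared fragment is written once and reused everywhere. The file name header.html is an assumption made for illustration.

    <!-- Sketch: _includes/header.html, a shared fragment written once
         (file name assumed for illustration) -->
    <header>
      <h1>{{ site.title }}</h1>
      <nav><a href="{{ '/' | relative_url }}">Home</a></nav>
    </header>

    <!-- Any layout or page can then reuse the fragment with a single Liquid tag -->
    {% include header.html %}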

subscribe via RSS