Hostal La Verde, Cienfuegos, Cuba



Hostal La Verde, a Cuban hostal website

A small website for a Hostal in Cuba.

Visitors can make bookings and contact the site owner with enquiries.

This was a fairly simple WordPress site with a custom booking system backed by a separate database.

Responsive HTML


An example of creating responsive design using CSS.

There are a range of ways to develop a responsive website using CSS and smart use of meta tags.

<meta name="viewport" content="">

This tag was introduced by Apple and is now supported by most mobile browsers. It overrides the initial viewport size of pages loaded on a mobile device, where the browser would otherwise automatically scale the page so that all of its content fits the screen. Since that automatic scaling is not always suitable for the content being displayed, setting the viewport explicitly plays a similar role to a CSS reset: it replaces unhelpful defaults with values you control.

For example, if one’s mobile design is intended to be 320px in width, one can specify the default viewport width:

<meta name="viewport" content="width=320">

Using CSS, one can use @media rules with max/min widths to set breakpoints that cause the layout to change.

There are two ways to define the width:

1) By adding the media attribute to the stylesheet link, so the CSS file is only applied at the widths defined, i.e.

<link rel="stylesheet" media="screen and (min-width: 720px) and (max-width: 980px)" href="path/to.css" />

2) By using @media rules in the CSS file, i.e.

@media all and (max-width: 960px) and (min-width: 720px) {
        body {
                background: #ccc;
        }
}

One can also use CSS to target the viewport instead of meta tags, via the @viewport rule: a -webkit- prefixed form for WebKit browsers and an -ms- prefixed form specifically for IE10 (IE9 and below don't support @viewport).

@-webkit-viewport {
       zoom: 1.0;
       width: extend-to-zoom;
}

The prefixed rule isn't supported by all browsers; IE10 instead uses its own -ms- prefixed version:

@-ms-viewport {
       width: extend-to-zoom;
       zoom: 1.0;
}

HTML5 Player


An example of an HTML5 audio player, using the audio tag and JavaScript.

To begin with, the audio tag, which is standard; no bells and whistles needed.

<audio id="player" controls="controls">
<source src="anaudio.mp3" type="audio/mpeg">
<source src="anaudio.ogg" type="audio/ogg">
Your browser does not support the audio element.
</audio>

This is what will play the audio.

Next, the HTML:

<canvas id="Canvas-2" width="600" height="120"></canvas> <!-- the 'played' background colour -->
<canvas id="Canvas-1" width="600" height="120"></canvas> <!-- the 'unplayed' background colour -->
<canvas id="Canvas0" width="600" height="120"></canvas> <!-- the waveform -->

Three canvases: two for the background colours (played/unplayed) and one for the waveform image. In the CSS and JavaScript they are referred to by the longer ids shown in the comments below.

canvas {left:0;top:0;position:absolute;}
#playedpositioncanvas {z-index:999} /*Canvas-2*/
#unplayedpositioncanvas {z-index:998} /*Canvas-1*/
#wavecanvas {z-index:100001} /*Canvas0 - wave form canvas on top*/

Canvas0 contains the waveform (the waveform can be created either by using an audio viewer and saving the image, or programmatically). This version was created from some refactored PHP code found on GitHub (see the site link below). However the waveform is produced, it needs to be rendered in Canvas0 on page load.
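Whichever route is taken, the underlying step is the same: downsample the raw audio samples to one peak value per horizontal pixel, then draw a vertical line per pixel. A minimal sketch of the downsampling (the function name and sample format are illustrative assumptions, not the GitHub code mentioned above):

```javascript
// Reduce an array of audio samples (values in [-1, 1]) to one
// peak per pixel column of the waveform canvas.
function samplesToPeaks(samples, pixelWidth) {
    var blockSize = Math.floor(samples.length / pixelWidth);
    var peaks = [];
    for (var px = 0; px < pixelWidth; px++) {
        var peak = 0;
        // take the largest absolute sample in this pixel's block
        for (var i = px * blockSize; i < (px + 1) * blockSize; i++) {
            var v = Math.abs(samples[i]);
            if (v > peak) peak = v;
        }
        peaks.push(peak);
    }
    return peaks;
}
```

Each peak is then drawn as a vertical line on Canvas0, scaled to the canvas height.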

Canvas-1 contains the ‘unplayed’ background colour.

Canvas-2 contains the ‘played’ background colour.

The colour change occurs by means of a javascript event listener on the play event of the audio tag.  Basic functionality hides Canvas-1 and shows Canvas-2 as the audio track progresses.

audio.addEventListener("play", function(){})

Event listeners can be attached to lots of different types of events; in the case of the player the main events of interest include the play and timeupdate events.

The overall time of the song is used to calculate the percentage completed (and hence remaining).

// define the audio as an object
audio = document.getElementById("player");
// set up the canvases
// canvas for the waveform
wavecanvas = document.getElementById('wavecanvas');
wavecanvascontext = wavecanvas.getContext('2d');
// canvas for the unplayed background colour
unplayedpositioncanvas = document.getElementById('unplayedpositioncanvas');
unplayedpositioncanvascontext = unplayedpositioncanvas.getContext('2d');
// canvas for the played background colour
playedpositioncanvas = document.getElementById('playedpositioncanvas');
playedpositioncanvascontext = playedpositioncanvas.getContext('2d');
// variables for the timer, the x offset of the last click on the canvas, and a flag for
// whether it is the first play (in which case, what state to show the canvases in)
var timerID;
widthClickedXCoordOffset = 0;
playedFirstTime = 0;
// set the unplayed background to be red
unplayedpositioncanvascontext.fillStyle = "red";
unplayedpositioncanvascontext.fillRect( 0, 0, 600, 120 );

// create a timer to count down the duration of the song,
// divided by the image width to get the countdown per pixel
function startPlayTimer(duration, widthClickedXCoordOffset) {
    console.log('timer started');
    x_pos = widthClickedXCoordOffset;
    // fire 'drawPlayingBg' every 'duration' milliseconds (one pixel per tick)
    timerID = setInterval( function() { drawPlayingBg(unplayedpositioncanvascontext); }, duration );
}

// stop the countdown - i.e. when the song ends
function stopTimer() {
    console.log('stop done');
    clearInterval(timerID);
}

// draw the played background as the time progresses
function drawPlayingBg(unplayedpositioncanvascontext) {
    console.log('drawPlayingBg done');
    playedpositioncanvascontext.clearRect(0, 0, unplayedpositioncanvas.width, unplayedpositioncanvas.height);
    playedpositioncanvascontext.fillRect( 0, 0, x_pos, 120 );
    x_pos = x_pos + 1;
}

// fired when a song has been selected
function DoThePlayer(unplayedpositioncanvascontext, unplayedpositioncanvas) {
    console.log('DoThePlayer done');
    // event listener when the player starts
    audio.addEventListener("play", function() {
        // remove a class from the border canvas when the song plays - this removes the border
        unplayedpositioncanvascontext.fillStyle = "red";
        unplayedpositioncanvascontext.fillRect( 0, 0, 600, 120 );
        // get the duration and the current time
        var duration = parseInt( audio.duration ),
            currenttime = parseInt( audio.currentTime ) + 1;
        // calculate the percentage of the audio track that has been played
        var perc = parseInt( audio.currentTime ) / $(this).width() * 100;
        var percplayback = perc * parseInt( audio.duration ) / 100;
    }, false);
    // event listener when the player ends
    audio.addEventListener("ended", function() {
        console.log('song ended');
        stopTimer();
    }, false);
    // event listener when the player is paused (currently unused)
    audio.addEventListener("pause", function() {
        console.log('song paused');
        var perc = parseInt( audio.currentTime ) / $(this).width() * 100;
        var percplayback = perc * parseInt( audio.duration ) / 100;
    }, false);
    // countdown time remaining of audio
    audio.addEventListener("timeupdate", function() {
        console.log('song timeupdated');
        // define the vars
        var img2 = document.getElementById('wavecanvas');
        // useful vars on the time
        var timeleft = document.getElementById('timeleft'),
            duration = parseInt( audio.duration ),
            currentTime = parseInt( audio.currentTime ),
            timeLeft = duration - currentTime,
            // the percentage of time remaining, based on the time remaining and the total time
            percentageLeft = (parseInt(currentTime) / parseInt(duration)) * 100,
            s, m;
    }, false);
}

// jump to a new position in the track
function changeTime(audio, percplayback) {
    console.log('time changed');
    audio.currentTime = percplayback;
}

// store the x coordinate of the click on the canvas
function setwidthClickedXCoordOffset(returnedwidthClickedXCoordOffset) {
    console.log('setwidthClickedXCoordOffset done');
    widthClickedXCoordOffset = returnedwidthClickedXCoordOffset;
}
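The `s` and `m` variables in the `timeupdate` listener exist to display the remaining time as minutes and seconds in the `timeleft` element. That formatting step can be sketched as follows (a hypothetical helper, not taken from the site's code):

```javascript
// Format a number of seconds as "m:ss" for the time-left display.
function formatTimeLeft(totalSeconds) {
    var m = Math.floor(totalSeconds / 60);
    var s = Math.floor(totalSeconds % 60);
    // zero-pad the seconds so 65 renders as "1:05", not "1:5"
    return m + ':' + (s < 10 ? '0' + s : s);
}
```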

The end result can be seen here: