To best explain this, let's take a hypothetical design: a `<section>` tag 600px wide, 400px high, and inside it a `<div>` that we want to style.

Percentages are based on the parent tag, so if we use `width: 100%;` the div will be 600px wide, as it takes up the full width of its parent. The same goes for `height: 100%;`, which fills the parent tag's height of 400px.
With `width: 100vw;`, the size isn't based on the parent tag but on the browser window instead (`vw` is short for "viewport width"). So the width of the div is now 100% of the browser width itself, which might be smaller or much larger than the parent tag. The same goes for `height: 100vh;`, which is 100% of the visible height of the page (`vh` is short for "viewport height").
In this project either approach gives the same result, but in other layouts the difference can be significant, so it's something to look out for!
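To make the distinction concrete, here's a minimal sketch of both approaches side by side (the class names are hypothetical, not from the project):

```css
/* Hypothetical parent: 600px wide, 400px high */
section {
  width: 600px;
  height: 400px;
}

/* Sized relative to the parent <section>: ends up 600px × 400px */
.percent-box {
  width: 100%;
  height: 100%;
}

/* Sized relative to the viewport: as wide and tall as the browser
   window, regardless of the 600px × 400px parent */
.viewport-box {
  width: 100vw;
  height: 100vh;
}
```

If the browser window happens to be exactly 600px × 400px the two boxes look identical, which is why the difference is easy to miss in a small project.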
This is because arrays (and most things) in JavaScript start at `0` instead of `1`. So if we look at the slides as being an array of elements like `[slide-1, slide-2, slide-3, slide-4]` (nb. this isn't real JavaScript, just an example!) we can see that we have four total slides. Now if we want to get the last slide we can't use `slides[4]`, because there's no element at index `4`; it's simply `undefined`. The last slide is actually `slides[3]`, so we need to use `totalSlides - 1` to access it. Written long-hand, without the in-between variables we use in the project, this'd look like `slides[slides.length - 1]`.
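A runnable sketch of the idea (the slide names here are just placeholder strings standing in for the real elements):

```javascript
// Placeholder slide names; in the real project these would be elements
const slides = ["slide-1", "slide-2", "slide-3", "slide-4"];

const totalSlides = slides.length; // 4

console.log(slides[totalSlides]);     // undefined, there is no index 4
console.log(slides[totalSlides - 1]); // "slide-4", the last slide is at index 3

// Written long-hand without the in-between variable:
console.log(slides[slides.length - 1]); // "slide-4"
```

The same `length - 1` pattern works for any array, which is why it shows up so often when looping or wrapping around to the last item.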