I have the following simple HTML:
<div id='d1'>
    <div id='d2'>
        <table id='d3'>
            <tr>
                <td>T1</td>
                <td>T2</td>
                <td>T3</td>
                <td>T4</td>
                <td>T5</td>
                <td>T6</td>
                <td>T7</td>
            </tr>
        </table>
    </div>
</div>
<br />
<span id='s1'></span>
CSS:
#d1 {
    width: 400px;
    overflow: auto;
    direction: ltr;
}
#d2 {
    width: 800px;
}
#d3 {
    border: 1px solid #ccc;
}
#d3 tr td {
    width: 120px;
    height: 200px;
}
#s1 {
    font-size: 25px;
    color: #f00;
}
JS / jQuery:
// Show the current horizontal scroll position in the red <span>.
var scroll = jQuery("#d1").scrollLeft();
jQuery("#s1").html(scroll);

jQuery("#d1").on("scroll", function() {
    scroll = jQuery(this).scrollLeft();
    jQuery("#s1").html(scroll);
});
The red number is the value of scroll at any moment. The value is exactly the same in Firefox and Chrome, but when I change the direction to rtl, things change (see it online):
In Firefox the scroll value starts at 0, and when scrolling to the left it goes down to -400.
But in Chrome it starts at 400, and when scrolling to the left it goes down to 0.
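To make the two ranges concrete, here is the behaviour above written out in code (max comes from the #d1 / #d2 widths in the CSS):

var el = jQuery("#d1")[0];
var max = el.scrollWidth - el.clientWidth; // 800 - 400 = 400 here
// Firefox (rtl): scrollLeft runs from 0 (at the start) down to -max (fully scrolled left)
// Chrome  (rtl): scrollLeft runs from max (at the start) down to 0 (fully scrolled left)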
The question is: how should I handle this difference when trying to detect whether the scrollbar is at the start or at the end? (As you noticed, Chrome's start is 400 while Firefox's start is 0, and so on.)
Of course I could detect the browser and do a different calculation for each one, like in this StackOverflow question, but I don't want to do that, because then I would have to special-case every browser (Firefox, Chrome, Safari, ...) and every IE version (6, 7, 8, 9, 10, 11, ...).
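What I would prefer is to feature-detect the behaviour instead. Below is a rough, untested sketch of that idea: probe a disposable off-screen RTL element once to classify which convention the browser uses, then map every scrollLeft reading onto one common 0-to-max scale. The helper names (detectRtlScrollType, normalizedScrollLeft) and the type labels are my own, not a standard API:

// Untested sketch: classify the browser's RTL scrollLeft convention once,
// using a disposable off-screen element instead of UA sniffing.
function detectRtlScrollType() {
    var probe = jQuery(
        "<div dir='rtl' style='position:absolute; top:-1000px; " +
        "width:100px; height:100px; overflow:scroll;'>" +
        "<div style='width:200px; height:1px;'></div></div>"
    ).appendTo("body");
    var el = probe[0];
    var type;
    if (el.scrollLeft > 0) {
        type = "default";          // like Chrome above: max .. 0
    } else {
        el.scrollLeft = 1;         // try to force a positive value
        type = (el.scrollLeft === 0)
            ? "negative"           // like Firefox above: 0 .. -max (clamped back to 0)
            : "reverse";           // e.g. old IE: 0 .. +max (the 1 sticks)
    }
    probe.remove();
    return type;
}

var rtlScrollType = detectRtlScrollType();

// Map every convention onto one scale: 0 = fully left, max = fully right.
function normalizedScrollLeft(el) {
    var max = el.scrollWidth - el.clientWidth;
    var x = el.scrollLeft;
    if (rtlScrollType === "negative") return x + max; // -max..0 -> 0..max
    if (rtlScrollType === "reverse")  return max - x; //  0..max, flipped
    return x;                                         // already 0..max
}

jQuery("#d1").on("scroll", function() {
    var max = this.scrollWidth - this.clientWidth;
    var pos = normalizedScrollLeft(this);
    jQuery("#s1").html(pos === max ? "at start (right)" :
                       pos === 0   ? "at end (left)"    : pos);
});

Since the probe lets each browser classify itself by its own behaviour, new engines or IE versions shouldn't need any extra cases. Is this a sound approach, or is there a simpler, standard way?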