I have a script, actually in Java, but never mind that. It's supposed to let the user type in a start time in hours, minutes, and seconds, and then an end time in hours, minutes, and seconds.
Then it calculates how many hours, minutes, and seconds it took.
Obviously it's a bit trickier than just endSeconds - startSeconds etc., since the start seconds could be greater than the end seconds, in which case you have to borrow a minute.
I think I recall that I had to use "%" for this, but I'm not sure how.
Please help. If it's of any help, here's the Java code for anyone who knows Java, but it's really just the math calculation I need help with.
Yeah, they solved it basically the same way I did: convert everything to seconds, then divide to get the hours and take those seconds out. Divide the remaining seconds to get the minutes and take those out. Whatever is left is the seconds.
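For anyone finding this thread later, here's a minimal sketch of that approach in Java. The class and method names are just made up for the example; the idea is to convert both clock times to total seconds, subtract, wrap around midnight with "%" (which also handles the start-seconds-greater-than-end-seconds case automatically), and then split the difference back into hours, minutes, and seconds with "/" and "%":

```java
public class ElapsedTime {
    static final int SECONDS_PER_DAY = 24 * 3600;

    // Total seconds elapsed from start to end on a 24-hour clock.
    // The double-% trick keeps the result non-negative even when the
    // end time is "earlier" than the start (i.e. it crossed midnight).
    static int elapsedSeconds(int sh, int sm, int ss, int eh, int em, int es) {
        int start = sh * 3600 + sm * 60 + ss;
        int end = eh * 3600 + em * 60 + es;
        return ((end - start) % SECONDS_PER_DAY + SECONDS_PER_DAY) % SECONDS_PER_DAY;
    }

    public static void main(String[] args) {
        // 09:30:45 to 11:15:20
        int total = elapsedSeconds(9, 30, 45, 11, 15, 20);
        int hours = total / 3600;           // whole hours
        int minutes = (total % 3600) / 60;  // whole minutes left after the hours
        int seconds = total % 60;           // seconds left after the minutes
        System.out.println(hours + "h " + minutes + "m " + seconds + "s");
    }
}
```

Note the extra `+ SECONDS_PER_DAY` before the second `%`: in Java, `%` on a negative number gives a negative result, so a plain `(end - start) % SECONDS_PER_DAY` would go negative whenever the end time is past midnight.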