Timestamp Converter

Convert Unix timestamps to human-readable dates and back.

Seconds & milliseconds · UTC & local time · ISO 8601 format

① Enter a Unix timestamp or click Now ② Convert to human-readable date ③ Copy any format

Unix Epoch — January 1, 1970 00:00:00 UTC
Seconds vs Milliseconds — 10 digits = seconds, 13 digits = milliseconds
ISO 8601 — YYYY-MM-DDTHH:mm:ssZ, the universal datetime standard
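In JavaScript, the ISO 8601 format shown above can be produced directly; a minimal sketch:

```javascript
// Date.prototype.toISOString always renders the instant in UTC
// (the trailing "Z"), matching the YYYY-MM-DDTHH:mm:ssZ pattern.
const epoch = new Date(0); // timestamp 0 = the Unix epoch
console.log(epoch.toISOString()); // "1970-01-01T00:00:00.000Z"
```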

What is a Unix Timestamp?

A Unix timestamp (also called epoch time) is the number of seconds elapsed since January 1, 1970 at 00:00:00 UTC. Programs use it to represent a specific point in time in a timezone-independent way. Millisecond timestamps have 13 digits and are used by JavaScript and many modern APIs.
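The digit-count heuristic described above can be sketched in JavaScript; `toDate` is a hypothetical helper name, not part of any API:

```javascript
// Roughly 10 digits => seconds, roughly 13 digits => milliseconds.
// Normalize either form to a JavaScript Date (which expects milliseconds).
function toDate(timestamp) {
  const digits = String(Math.abs(timestamp)).length;
  const ms = digits >= 13 ? timestamp : timestamp * 1000;
  return new Date(ms);
}

console.log(toDate(1773589955).toISOString());    // seconds input
console.log(toDate(1773589955000).toISOString()); // same instant, milliseconds input
```

Both calls print the same ISO string, since they denote the same instant.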

Frequently Asked Questions

What is a Unix timestamp?
A Unix timestamp is the number of seconds that have elapsed since January 1, 1970 (the Unix epoch). It provides a simple, timezone-independent way to represent points in time.
How do I convert a Unix timestamp to a date?
If the timestamp is in seconds, multiply it by 1000 and pass it to new Date(timestamp * 1000) in JavaScript; if it is already in milliseconds, pass it directly to new Date(timestamp), or hand it to a date library. This tool detects the unit automatically.
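A minimal round-trip in JavaScript, assuming a seconds-based timestamp:

```javascript
// Seconds-based timestamp -> Date (Date expects milliseconds).
const seconds = 86400; // exactly one day after the epoch
const date = new Date(seconds * 1000);
console.log(date.toISOString()); // "1970-01-02T00:00:00.000Z"

// Reverse: Date -> seconds (getTime returns milliseconds).
const back = Math.floor(date.getTime() / 1000);
console.log(back); // 86400
```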
What is the difference between seconds and milliseconds timestamps?
A 10-digit timestamp is in seconds; a 13-digit timestamp is in milliseconds. JavaScript uses milliseconds by default, while Unix systems traditionally use seconds.
What is the Unix epoch?
The Unix epoch is January 1, 1970 at 00:00:00 UTC. It was chosen as a convenient reference point early in Unix's development. All Unix timestamps count from this moment.
Can Unix timestamps be negative?
Yes. Negative timestamps represent dates before January 1, 1970. For example, -86400 equals December 31, 1969 at 00:00:00 UTC.
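The -86400 example can be checked directly in JavaScript:

```javascript
// -86400 seconds is exactly one day before the epoch.
const before = new Date(-86400 * 1000);
console.log(before.toISOString()); // "1969-12-31T00:00:00.000Z"
```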
Where are Unix timestamps used?
Timestamps are used in databases, log files, APIs, file systems, cookies, and JWT tokens. They are timezone-independent and easy to compare or sort arithmetically.
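Because timestamps are plain numbers, comparing or sorting events is simple numeric arithmetic; a sketch with made-up event data:

```javascript
// Sorting events chronologically is just a numeric sort on the timestamp.
const events = [
  { name: "second", ts: 1700000100 },
  { name: "first",  ts: 1700000000 },
];
events.sort((a, b) => a.ts - b.ts);
console.log(events.map(e => e.name)); // [ 'first', 'second' ]
```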
What is the difference between UTC and local time?
UTC (Coordinated Universal Time) is the global time standard, with no daylight saving adjustments. Local time is UTC shifted by your timezone's offset. A timestamp itself always counts from the epoch in UTC; only its display varies by timezone.
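In JavaScript, the same instant can be rendered in UTC or in the machine's local zone; the underlying timestamp never changes:

```javascript
// One billion seconds after the epoch.
const d = new Date(1000000000 * 1000);
console.log(d.toISOString());       // "2001-09-09T01:46:40.000Z" — always UTC
console.log(d.toString());          // rendered in the machine's local timezone
console.log(d.getTimezoneOffset()); // minutes between local time and UTC
```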
