Convert between Unix timestamps and human-readable dates. Paste a timestamp in seconds or milliseconds to see the corresponding date and time, or enter a date to get its Unix timestamp.
Current Unix timestamp
seconds (Unix epoch)
milliseconds
| Timestamp | Date (UTC) | Description |
|---|---|---|
A Unix timestamp (also called epoch time or POSIX time) is the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC — an arbitrary point in time chosen as the Unix epoch when the Unix operating system was being developed.
Because it is a single integer, Unix time is easy to store, compare, sort, and perform arithmetic on. It is the standard way to represent moments in time in databases, APIs, log files, and programming languages.
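Because timestamps are plain integers, durations fall out of ordinary subtraction. A minimal Python sketch (the values are arbitrary examples; 1700000000 happens to fall on November 14, 2023):

```python
import datetime

# Two moments as Unix timestamps (seconds since the epoch, UTC)
start = 1700000000
end = 1700003600

# Plain integer arithmetic gives the elapsed time directly
elapsed = end - start  # 3600 seconds = 1 hour

# Converting back to a readable date only matters for display
print(datetime.datetime.fromtimestamp(start, tz=datetime.timezone.utc))
# → 2023-11-14 22:13:20+00:00
```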
The original Unix timestamp is measured in seconds. Many modern systems (JavaScript's Date.now(), Java, Kotlin) use milliseconds instead, producing numbers 1,000× larger. A timestamp in seconds currently has 10 digits (around 1.7 billion in the mid-2020s); a timestamp in milliseconds has 13.
| Format | Example (approx. early 2025) | Digits |
|---|---|---|
| Seconds | 1740000000 | 10 digits |
| Milliseconds | 1740000000000 | 13 digits |
| Microseconds | 1740000000000000 | 16 digits |
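The three formats differ only by factors of 1,000, so converting between them is integer multiplication or division. A quick Python illustration:

```python
seconds = 1_700_000_000           # 10 digits
millis = seconds * 1_000          # 13 digits, the precision Date.now() uses
micros = seconds * 1_000_000      # 16 digits

# A millisecond timestamp converts back to seconds by integer division
assert millis // 1_000 == seconds
print(len(str(seconds)), len(str(millis)), len(str(micros)))  # 10 13 16
```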
The converter auto-detects seconds vs. milliseconds based on the magnitude of the input (numbers ≥ 1×10¹¹ are treated as milliseconds by default), but you can override it with the radio buttons.
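That magnitude rule can be sketched in a few lines of Python (an illustration of the heuristic only; the converter's actual implementation may differ):

```python
def detect_unit(value: int) -> str:
    """Guess whether a raw timestamp is seconds or milliseconds.

    Heuristic: values at or above 1e11 are assumed to be milliseconds,
    since 1e11 read as seconds would be roughly the year 5138.
    """
    return "milliseconds" if abs(value) >= 10**11 else "seconds"

print(detect_unit(1_700_000_000))      # a 10-digit value → seconds
print(detect_unit(1_700_000_000_000))  # a 13-digit value → milliseconds
```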
On 32-bit systems that store Unix time as a signed 32-bit integer, the maximum representable value is 2,147,483,647, which corresponds to January 19, 2038 at 03:14:07 UTC. One second later the integer overflows and wraps to a large negative number, causing affected systems to misread the date as December 1901 — the so-called Year 2038 problem.
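The wraparound can be simulated by masking a timestamp down to 32 bits (a toy model of the overflow, not production code):

```python
import datetime

def wrap_int32(ts: int) -> int:
    """Simulate storing a timestamp in a signed 32-bit integer."""
    ts &= 0xFFFFFFFF
    return ts - 2**32 if ts >= 2**31 else ts

max32 = 2**31 - 1  # 2,147,483,647
print(datetime.datetime.fromtimestamp(max32, tz=datetime.timezone.utc))
# → 2038-01-19 03:14:07+00:00

# One second later the value wraps negative ...
wrapped = wrap_int32(max32 + 1)
print(wrapped, datetime.datetime.fromtimestamp(wrapped, tz=datetime.timezone.utc))
# ... and is read back as a date in December 1901
```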
Most modern systems use 64-bit integers, which push the overflow date to approximately the year 292,277,026,596 — effectively not a concern in practice.
What is the Unix epoch?
The epoch is the starting point: January 1, 1970, 00:00:00 UTC. A Unix timestamp of 0 represents exactly this moment. Negative timestamps represent times before 1970.
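Both cases are easy to check in Python (a small sketch using the standard library):

```python
from datetime import datetime, timezone

# Timestamp 0 is the epoch itself
print(datetime.fromtimestamp(0, tz=timezone.utc))       # 1970-01-01 00:00:00+00:00

# A negative timestamp is a moment before 1970: one day earlier here
print(datetime.fromtimestamp(-86400, tz=timezone.utc))  # 1969-12-31 00:00:00+00:00
```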
Why do JavaScript timestamps use milliseconds?
JavaScript's Date object was designed with millisecond precision from the start. Date.now() and new Date().getTime() both return milliseconds since the Unix epoch. To convert to seconds (as used in most databases and APIs), divide by 1000.
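The same divide-by-1000 conversion, sketched in Python for comparison (Date.now() itself is JavaScript; `time.time() * 1000` is an analogous millisecond value):

```python
import time

millis = int(time.time() * 1000)  # millisecond precision, like JavaScript's Date.now()
seconds = millis // 1000          # equivalent of Math.floor(Date.now() / 1000)
```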
How do I get the current Unix timestamp in code?
- JavaScript: `Math.floor(Date.now() / 1000)`
- Python: `import time; int(time.time())`
- PHP: `time()`
- Shell: `date +%s`
- SQL: `UNIX_TIMESTAMP()` (MySQL) / `EXTRACT(EPOCH FROM NOW())` (PostgreSQL)

Are Unix timestamps affected by time zones?
No. Unix timestamps are always in UTC. The conversion to a human-readable date depends on the local time zone, but the underlying number is always UTC-based.
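For example, rendering one timestamp in UTC and in a fixed UTC+9 offset gives different wall-clock displays for the identical instant (Python sketch; the timestamp value is arbitrary):

```python
from datetime import datetime, timezone, timedelta

ts = 1_700_000_000
utc = datetime.fromtimestamp(ts, tz=timezone.utc)
plus9 = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=9)))  # UTC+9

print(utc)    # 2023-11-14 22:13:20+00:00
print(plus9)  # 2023-11-15 07:13:20+09:00 — same moment, different display
assert utc == plus9
```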