What is a Unix Timestamp? Epoch Time Explained Simply

A Unix timestamp is a single integer representing the number of seconds that have elapsed since January 1, 1970, at 00:00:00 UTC. It is the universal language of time in software development.

Why January 1, 1970?

When Unix was developed in the late 1960s and early 1970s at Bell Labs, the engineers needed a consistent reference point for storing time. They chose January 1, 1970, 00:00:00 UTC — a date close to when Unix was created — as the "epoch." Every moment in time is expressed as an offset from this epoch in seconds.

The choice was somewhat arbitrary, but it has become the global standard. Every programming language, operating system, and database understands Unix timestamps.

Seconds vs milliseconds

The original Unix timestamp counts seconds. However, many modern systems — especially JavaScript — use milliseconds for better precision. You can tell the difference by the number of digits:

Value            Unit                       Represents
1700000000       Seconds (10 digits)        Nov 14, 2023
1700000000000    Milliseconds (13 digits)   Nov 14, 2023

If your timestamp has 13 digits, divide by 1000 to get seconds. If it has 10, it's already in seconds.
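The digit-count rule can be sketched as a small helper. This is a hypothetical function, not a standard API; the 1e12 cutoff is an assumption that works because 1e12 milliseconds is September 2001, so any real-world millisecond timestamp from then on clears it, while second-based timestamps stay far below it.

```javascript
// Hypothetical helper: normalize a timestamp to seconds,
// whether it arrives in seconds (10 digits) or milliseconds (13 digits).
function toSeconds(ts) {
  // Values >= 1e12 interpreted as seconds would be tens of thousands
  // of years in the future, so treat them as milliseconds instead.
  return ts >= 1e12 ? Math.floor(ts / 1000) : ts;
}

toSeconds(1700000000);    // already seconds → 1700000000
toSeconds(1700000000000); // milliseconds   → 1700000000
```

Note the caveat baked into the cutoff: a millisecond timestamp from before September 2001 would fall below 1e12 and be misread as seconds.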

How to get the current timestamp

# JavaScript
Math.floor(Date.now() / 1000)   // seconds
Date.now()                       // milliseconds

# Python
import time
int(time.time())                 # seconds

# PHP
time()                           # seconds

# Bash / Linux terminal
date +%s                         # seconds

# SQL (MySQL / PostgreSQL)
SELECT UNIX_TIMESTAMP();         -- MySQL
SELECT EXTRACT(EPOCH FROM NOW()); -- PostgreSQL

How to convert a timestamp to a readable date

// Unix timestamp (seconds) to Date
const ts = 1700000000;
const date = new Date(ts * 1000); // multiply by 1000 for ms
console.log(date.toISOString());  // "2023-11-14T22:13:20.000Z"
console.log(date.toLocaleString()); // local time string

// Date to Unix timestamp
const now = new Date();
const unixSeconds = Math.floor(now.getTime() / 1000);

Why timestamps are better than date strings

  • Timezone-independent: 1700000000 means the same instant everywhere. "2023-11-14 22:13" depends on which timezone you're in.
  • Easy arithmetic: Subtract two timestamps to get the duration in seconds. No date parsing needed.
  • Sortable: Timestamps sort correctly as plain integers. ISO date strings also sort correctly, but locale-formatted dates do not.
  • Compact: A 10-digit integer takes less space than a formatted date string.
  • Universal: Every language and database understands Unix timestamps natively.
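The arithmetic and sorting points above can be made concrete in a few lines. The timestamp values here are invented for illustration:

```javascript
// Duration arithmetic: subtracting two timestamps gives seconds directly.
const deploy = 1700000000;   // example value
const rollback = 1700003600; // example value, one hour later
const downtimeSeconds = rollback - deploy; // 3600 — no date parsing needed

// Sorting: plain numeric comparison orders events chronologically.
const events = [1700003600, 1700000000, 1700001800];
events.sort((a, b) => a - b); // [1700000000, 1700001800, 1700003600]
```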

The year 2038 problem

On 32-bit systems, Unix time is traditionally stored as a signed 32-bit integer, which maxes out at 2,147,483,647, corresponding to January 19, 2038, 03:14:07 UTC. One second later, the counter overflows to a negative number. Modern 64-bit systems are not affected: a signed 64-bit counter can represent timestamps roughly 292 billion years into the future.
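You can simulate the wraparound in JavaScript, since its bitwise operators coerce numbers to signed 32-bit integers. This is a demonstration sketch, not how an operating system actually stores time:

```javascript
// Bitwise OR with 0 coerces to a signed 32-bit integer, so adding one
// to the maximum value wraps around to the minimum.
const max32 = 2147483647;           // Jan 19, 2038, 03:14:07 UTC
const overflowed = (max32 + 1) | 0; // -2147483648

// Interpreted as a Unix timestamp, the wrapped value lands in 1901.
console.log(new Date(overflowed * 1000).toISOString());
// "1901-12-13T20:45:52.000Z"
```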

Reference timestamps

Timestamp     Date
0             Jan 1, 1970 00:00:00 UTC (the epoch)
1000000000    Sep 9, 2001
1234567890    Feb 13, 2009 (celebrated across the internet)
1700000000    Nov 14, 2023
2000000000    May 18, 2033
2147483647    Jan 19, 2038 (32-bit max)
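The reference table above can be spot-checked with a short loop:

```javascript
// Convert each reference timestamp (in seconds) to an ISO 8601 string.
const refs = [0, 1000000000, 1234567890, 1700000000, 2000000000, 2147483647];
for (const ts of refs) {
  console.log(ts, new Date(ts * 1000).toISOString());
}
// e.g. 0          → "1970-01-01T00:00:00.000Z"
//      1234567890 → "2009-02-13T23:31:30.000Z"
```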
