Convert Unix timestamps to human-readable dates and vice versa.
Format: YYYY-MM-DD HH:mm:ss
A Unix timestamp (also known as Unix time, POSIX time, or Epoch time) is a system for describing a point in time. It is the number of seconds that have elapsed since the Unix epoch, which was at 00:00:00 UTC on 1 January 1970.
Because it's a single, universally consistent number, it's not affected by timezones, making it ideal for use in computer systems and data storage worldwide.
The concept is simple: we count the number of seconds from a fixed starting point, the 'epoch'. This tool converts that count back into a human-readable date, or vice versa.
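For instance, here is a minimal TypeScript sketch of that round trip, using the language's built-in Date API (the sample timestamp is arbitrary):

```typescript
// Unix time 0 is the epoch itself: 1970-01-01T00:00:00Z.
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"

// A count of elapsed seconds identifies exactly one instant.
// Date works in milliseconds, so the seconds are scaled by 1000.
const ts = 1700000000;
console.log(new Date(ts * 1000).toISOString()); // "2023-11-14T22:13:20.000Z"

// And vice versa: from a date string back to a Unix timestamp in seconds.
console.log(Math.floor(Date.parse("2023-11-14T22:13:20Z") / 1000)); // 1700000000
```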
When converting to a date, you must select the correct timezone to ensure the date and time are displayed accurately for that location.
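In code, that selection can be sketched with the standard Intl.DateTimeFormat API (formatTimestamp is a hypothetical helper, and the en-CA locale is chosen only because it yields the YYYY-MM-DD layout shown above; exact separators vary by runtime):

```typescript
// Hypothetical helper: render a Unix timestamp (in seconds) in a given
// IANA timezone, in roughly YYYY-MM-DD HH:mm:ss layout.
function formatTimestamp(seconds: number, timeZone: string): string {
  return new Intl.DateTimeFormat("en-CA", {
    timeZone,
    year: "numeric", month: "2-digit", day: "2-digit",
    hour: "2-digit", minute: "2-digit", second: "2-digit",
    hourCycle: "h23",
  }).format(new Date(seconds * 1000));
}

console.log(formatTimestamp(1700000000, "UTC"));        // 2023-11-14, 22:13:20
console.log(formatTimestamp(1700000000, "Asia/Tokyo")); // 2023-11-15, 07:13:20
```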
Traditionally, Unix time is in seconds. However, many modern systems and programming languages (like JavaScript) work with milliseconds to achieve higher precision. A millisecond timestamp is simply the number of milliseconds since the epoch.
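The difference is easy to see in TypeScript, where the built-in clock reports milliseconds:

```typescript
// Date.now() reports milliseconds since the epoch (currently 13 digits).
const ms = Date.now();                 // e.g. 1700000000000
// Traditional Unix time drops the sub-second part (currently 10 digits).
const seconds = Math.floor(ms / 1000); // e.g. 1700000000

// Converting between the two is a factor of 1000 in either direction.
const backToMs = seconds * 1000;
```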
This tool automatically detects whether your input is in seconds (a 10-digit number) or milliseconds (a 13-digit number) and handles the conversion correctly.
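The tool's exact detection logic isn't spelled out here, but a digit-count heuristic can be sketched as follows (toMilliseconds is a hypothetical name):

```typescript
// Sketch of digit-count detection: 13-digit inputs are treated as
// milliseconds, shorter ones as seconds. Note the heuristic only holds for
// dates between 2001-09-09 and 2286-11-20, where seconds have 10 digits.
function toMilliseconds(input: string): number {
  const value = Number(input.trim());
  if (!Number.isFinite(value)) throw new Error(`Not a number: ${input}`);
  const digits = String(Math.abs(Math.trunc(value))).length;
  return digits >= 13 ? value : value * 1000;
}

console.log(toMilliseconds("1700000000"));    // 1700000000000 (seconds → ms)
console.log(toMilliseconds("1700000000000")); // 1700000000000 (already ms)
```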
A timestamp itself is just a number without a timezone. It represents the same moment in time everywhere. However, to display it as a human-readable date (e.g., 'March 15th, 2:00 PM'), we must know *where* in the world we are.
For example, a single timestamp corresponds to 9 AM in Tokyo (UTC+9) but 5 PM the previous day in Los Angeles (UTC-7 during daylight saving time). Selecting the correct timezone is crucial for accurate representation.
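A sketch of that contrast, using the IANA zone names Asia/Tokyo and America/Los_Angeles (the timestamp 1720569600, i.e. 2024-07-10T00:00:00Z, is an arbitrary instant that falls in Pacific Daylight Time):

```typescript
// One instant, two local renderings. Exact separators vary by runtime.
const ts = 1720569600; // 2024-07-10T00:00:00Z
for (const timeZone of ["Asia/Tokyo", "America/Los_Angeles"]) {
  const local = new Intl.DateTimeFormat("en-CA", {
    timeZone, dateStyle: "short", timeStyle: "medium", hourCycle: "h23",
  }).format(new Date(ts * 1000));
  console.log(`${timeZone}: ${local}`);
}
// Asia/Tokyo: 2024-07-10, 09:00:00          (9 AM on the 10th)
// America/Los_Angeles: 2024-07-09, 17:00:00 (5 PM on the 9th)
```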