pyspark.sql.functions.timestamp_seconds

pyspark.sql.functions.timestamp_seconds(col)

Converts the number of seconds from the Unix epoch (1970-01-01T00:00:00Z) to a timestamp.

New in version 3.1.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col : Column or str

Unix time values, in seconds since the epoch.

Returns
Column

Converted timestamp value.

Examples

>>> from pyspark.sql.functions import timestamp_seconds
>>> spark.conf.set("spark.sql.session.timeZone", "UTC")
>>> time_df = spark.createDataFrame([(1230219000,)], ['unix_time'])
>>> time_df.select(timestamp_seconds(time_df.unix_time).alias('ts')).show()
+-------------------+
|                 ts|
+-------------------+
|2008-12-25 15:30:00|
+-------------------+
>>> time_df.select(timestamp_seconds('unix_time').alias('ts')).printSchema()
root
 |-- ts: timestamp (nullable = true)
>>> spark.conf.unset("spark.sql.session.timeZone")