
I am working in PySpark and I have my dataframe set up with two columns representing daily asset prices, as follows:

ind = sc.parallelize(range(1,5))
prices = sc.parallelize([33.3,31.1,51.2,21.3])
data = ind.zip(prices)
df = sqlCtx.createDataFrame(data,["day","price"])
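
For reference, on Spark 2+ the same dataframe can be built through a SparkSession rather than sc/sqlCtx; a minimal sketch, assuming a session named spark:

# Assumes a Spark 2+ SparkSession called `spark`
rows = [(1, 33.3), (2, 31.1), (3, 51.2), (4, 21.3)]
df = spark.createDataFrame(rows, ["day", "price"])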

Applying df.show() gives:

+---+-----+
|day|price|
+---+-----+
|  1| 33.3|
|  2| 31.1|
|  3| 51.2|
|  4| 21.3|
+---+-----+

So far so good. I would like another column containing the daily return of the price column, i.e. something like

(price(day2)-price(day1))/(price(day1))

After a lot of research, I am told this is most efficiently accomplished through the pyspark.sql.window functions, but I cannot see how.


2 Answers


You can bring the previous day's price into a column with the lag function, and then add an additional column that computes the actual return from those two. However, you may have to tell Spark how to partition your data and/or order it for the lag to be performed, something like this:

from pyspark.sql.window import Window
import pyspark.sql.functions as func
from pyspark.sql.functions import lit

# Add a constant dummy column so every row lands in the same window partition
dfu = df.withColumn('user', lit('tmoore'))

# lag requires an ordered window, so order the partition by day
df_lag = dfu.withColumn('prev_day_price',
                        func.lag(dfu['price'])
                            .over(Window.partitionBy("user").orderBy("day")))

result = df_lag.withColumn('daily_return',
          (df_lag['price'] - df_lag['prev_day_price']) / df_lag['price'])

>>> result.show()
+---+-----+-------+--------------+--------------------+
|day|price|   user|prev_day_price|        daily_return|
+---+-----+-------+--------------+--------------------+
|  1| 33.3| tmoore|          null|                null|
|  2| 31.1| tmoore|          33.3|-0.07073954983922816|
|  3| 51.2| tmoore|          31.1|         0.392578125|
|  4| 21.3| tmoore|          51.2|  -1.403755868544601|
+---+-----+-------+--------------+--------------------+
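
Note that daily_return above divides by the current day's price, while the formula in the question divides by the previous day's price. To match (price(day2)-price(day1))/(price(day1)) exactly, divide by prev_day_price instead:

# Matches the question's formula: divide by the previous day's price
result = df_lag.withColumn('daily_return',
          (df_lag['price'] - df_lag['prev_day_price']) / df_lag['prev_day_price'])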

Here is a detailed introduction to Window functions in Spark.

Answered 2016-04-20T19:05:59.793

The lag function can help you with your use case.

from pyspark.sql.window import Window
import pyspark.sql.functions as func

### Defining the window
window_spec = Window.orderBy("day")

### Calculating the lag of price at each day level
prev_day_price = df.withColumn('prev_day_price',
                               func.lag(df['price']).over(window_spec))

### Calculating the daily return
result = prev_day_price.withColumn('daily_return',
          (prev_day_price['price'] - prev_day_price['prev_day_price']) /
          prev_day_price['price'])
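
One caveat: Window.orderBy with no partitionBy moves all rows into a single partition, and Spark logs a performance warning about this on large data, which is why the first answer partitions by a dummy user column. A minimal alternative sketch, partitioning by a constant literal rather than adding an extra column (this still funnels everything through one partition, but makes the intent explicit):

from pyspark.sql.functions import lit

# Constant-literal partition: keeps all rows in one window without an extra column
window_spec = Window.partitionBy(lit(1)).orderBy("day")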
Answered 2019-03-28T11:49:11.420