Commit 8054aed: backup handle-missing-value-II
1 parent 5f41272

9 files changed, +993 −1028 lines

Handle-Missing-Value-II.Rmd

+846 −983
Large diffs are not rendered by default.

binary-Q1CGARCH.Rmd

+20 −12
@@ -39,7 +39,15 @@ rm(pkgs)
 
 # Introduction
 
-<span style='color:ForestGreen'>Enjoy the Joy of Copulas - With a Package copula</span>
+Because the copula-DCC-GARCH models in <span style='color:goldenrod'>*binary.com Interview Question I - Multivariate GARCH Models*</span> did not execute smoothly, and I believe the Copula-GARCH and GO-GARCH models will fit better than plain DCC models, here I consult further references such as <span style='color:goldenrod'>*Higher Moment Models for Risk and Portfolio Management*</span>; its author published a thesis covering a series of GARCH models.
+
+Yihui Xie once published the post [The 20 papers on copulas I have on hand (download)](https://d.cosx.org/d/6181-6181); the Copula-GARCH model is widely used in financial markets. See also <span style='color:goldenrod'>*Enjoy the Joy of Copulas - With a Package `copula`*</span>...
+
+# Data
+
+
+# Modelling
+
 
 
 # Conclusion
@@ -66,16 +74,14 @@ It's useful to record some information about how your file was created.
 - Additional session information:
 
 ```{r info, echo=FALSE, warning=FALSE, results='asis'}
-sys1 <- devtools::session_info()$platform %>%
-unlist %>% data.frame(Category = names(.), session_info = .)
+sys1 <- session_info()$platform %>%
+unlist %>%
+data.frame(Category = names(.), session_info = .)
 rownames(sys1) <- NULL
 
-#sys1 %<>% rbind(., data.frame(
-#  Category = 'Current time',
-#  session_info = paste(as.character(lubridate::now('Asia/Tokyo')), 'JST'))) %>%
-#  dplyr::filter(Category != 'os')
-
-sys2 <- data.frame(Sys.info()) %>% mutate(Category = rownames(.)) %>% .[2:1]
+sys2 <- data.frame(Sys.info()) %>%
+mutate(Category = rownames(.)) %>%
+.[2:1]
 names(sys2)[2] <- c('Sys.info')
 rownames(sys2) <- NULL
 
@@ -98,9 +104,11 @@ rm(sys1, sys2)
 
 ## Reference
 
-01. [Enjoy the Joy of Copulas - With a Package copula](Enjoy the Joy of Copulas - With a Package copula.pdf)
-02. [Modeling Multivariate Count Data using Copulas](Modeling Multivariate Count Data using Copulas.pdf)
-03. [Modeling Multivariate Distributions using Copulas](Modeling Multivariate Distributions using Copulas.pdf)
+01. [Enjoy the Joy of Copulas - With a Package `copula`](https://github.com/englianhu/binary.com-interview-question/blob/master/reference/Enjoy%20the%20Joy%20of%20Copulas%20-%20With%20a%20ackage%20copula.pdf)
+02. [Modeling Multivariate Count Data using Copulas](https://github.com/englianhu/binary.com-interview-question/blob/master/reference/Modeling%20Multivariate%20Count%20Data%20using%20Copulas.pdf)
+03. [Modeling Multivariate Distributions using Copulas](https://github.com/englianhu/binary.com-interview-question/blob/master/reference/Modeling%20Multivariate%20Distributions%20using%20Copulas.pdf)
+04. [Higher Moment Models for Risk and Portfolio Management](https://github.com/englianhu/binary.com-interview-question/blob/master/reference/Higher%20Moment%20Models%20for%20Risk%20and%20Portfolio%20Management.pdf)
+05. [binary.com Interview Question I - Multivariate GARCH Models](http://rpubs.com/englianhu/binary-Q1Multi-GARCH)
 
 ---
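As a rough illustration of the Copula-GARCH model that the new Introduction and Modelling sections refer to, the `rmgarch` package provides `cgarchspec()`/`cgarchfit()`. The sketch below is a minimal, assumed specification (GARCH(1,1) marginals plus a time-varying Student-t copula), not the exact model from the referenced thesis; the `dji30retw` sample data is assumed to ship with `rmgarch`.

```r
# Minimal Copula-GARCH sketch (illustrative; not the thesis' exact model).
library(rugarch)   # univariate GARCH specifications
library(rmgarch)   # copula-GARCH / GO-GARCH wrappers

# Sample weekly Dow Jones 30 returns assumed bundled with rmgarch;
# two series keep the fit quick.
data(dji30retw, package = 'rmgarch')
rtn <- dji30retw[, 1:2]

# One ARMA(1,0)-GARCH(1,1) spec per marginal series.
uspec <- multispec(replicate(2, ugarchspec(
  variance.model = list(model = 'sGARCH', garchOrder = c(1, 1)),
  mean.model     = list(armaOrder = c(1, 0)))))

# Student-t copula with time-varying (DCC-style) dependence.
cspec <- cgarchspec(uspec,
  distribution.model = list(copula = 'mvt', method = 'ML',
                            time.varying = TRUE,
                            transformation = 'parametric'))

cfit <- cgarchfit(cspec, data = rtn)
show(cfit)
```

GO-GARCH can be sketched the same way via `gogarchspec()`/`gogarchfit()` from the same package.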

binary-Q1CGARCH.html

+36 −21
Large diffs are not rendered by default.

binary-Q1Inter-HFT.Rmd

+2 −1
@@ -1419,7 +1419,8 @@ It's useful to record some information about how your file was created.
 
 ```{r info, echo=FALSE, warning=FALSE, results='asis'}
 sys1 <- session_info()$platform %>%
-unlist %>% data.frame(Category = names(.), session_info = .)
+unlist %>%
+data.frame(Category = names(.), session_info = .)
 rownames(sys1) <- NULL
 
 sys2 <- data.frame(Sys.info()) %>%
logs/log4j.spark.log

+78 −11
@@ -1,11 +1,78 @@
-18/10/23 15:38:44 INFO SparkContext: Invoking stop() from shutdown hook
-18/10/23 15:38:44 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4041
-18/10/23 15:38:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
-18/10/23 15:38:44 INFO MemoryStore: MemoryStore cleared
-18/10/23 15:38:44 INFO BlockManager: BlockManager stopped
-18/10/23 15:38:44 INFO BlockManagerMaster: BlockManagerMaster stopped
-18/10/23 15:38:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
-18/10/23 15:38:44 INFO SparkContext: Successfully stopped SparkContext
-18/10/23 15:38:44 INFO ShutdownHookManager: Shutdown hook called
-18/10/23 15:38:44 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-96fd60e0-c499-441b-971d-e8c65875c935
-18/10/23 15:38:44 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-36507bed-827c-4f38-8a2e-a362bd113c59
+18/10/24 20:32:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
+18/10/24 20:32:34 INFO SparkContext: Running Spark version 2.3.2
+18/10/24 20:32:34 INFO SparkContext: Submitted application: sparklyr
+18/10/24 20:32:34 INFO SecurityManager: Changing view acls to: scibr
+18/10/24 20:32:34 INFO SecurityManager: Changing modify acls to: scibr
+18/10/24 20:32:34 INFO SecurityManager: Changing view acls groups to:
+18/10/24 20:32:34 INFO SecurityManager: Changing modify acls groups to:
+18/10/24 20:32:34 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(scibr); groups with view permissions: Set(); users with modify permissions: Set(scibr); groups with modify permissions: Set()
+18/10/24 20:32:35 INFO Utils: Successfully started service 'sparkDriver' on port 50109.
+18/10/24 20:32:35 INFO SparkEnv: Registering MapOutputTracker
+18/10/24 20:32:35 INFO SparkEnv: Registering BlockManagerMaster
+18/10/24 20:32:35 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
+18/10/24 20:32:35 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
+18/10/24 20:32:35 INFO DiskBlockManager: Created local directory at C:\Users\scibr\AppData\Local\Temp\blockmgr-42837de4-0ea7-4343-9942-202186a58ec8
+18/10/24 20:32:35 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
+18/10/24 20:32:36 INFO SparkEnv: Registering OutputCommitCoordinator
+18/10/24 20:32:36 INFO Utils: Successfully started service 'SparkUI' on port 4040.
+18/10/24 20:32:36 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://127.0.0.1:4040
+18/10/24 20:32:36 INFO SparkContext: Added JAR file:/C:/Users/scibr/Documents/R/win-library/3.5/sparklyr/java/sparklyr-2.3-2.11.jar at spark://127.0.0.1:50109/jars/sparklyr-2.3-2.11.jar with timestamp 1540380756770
+18/10/24 20:32:36 INFO Executor: Starting executor ID driver on host localhost
+18/10/24 20:32:37 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50131.
+18/10/24 20:32:37 INFO NettyBlockTransferService: Server created on 127.0.0.1:50131
+18/10/24 20:32:37 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
+18/10/24 20:32:37 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 127.0.0.1, 50131, None)
+18/10/24 20:32:37 INFO BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:50131 with 366.3 MB RAM, BlockManagerId(driver, 127.0.0.1, 50131, None)
+18/10/24 20:32:37 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 127.0.0.1, 50131, None)
+18/10/24 20:32:37 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 127.0.0.1, 50131, None)
+18/10/24 20:32:38 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
+18/10/24 20:32:50 INFO SparkContext: Invoking stop() from shutdown hook
+18/10/24 20:32:50 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
+18/10/24 20:32:50 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
+18/10/24 20:32:50 INFO MemoryStore: MemoryStore cleared
+18/10/24 20:32:50 INFO BlockManager: BlockManager stopped
+18/10/24 20:32:50 INFO BlockManagerMaster: BlockManagerMaster stopped
+18/10/24 20:32:50 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
+18/10/24 20:32:50 INFO SparkContext: Successfully stopped SparkContext
+18/10/24 20:32:50 INFO ShutdownHookManager: Shutdown hook called
+18/10/24 20:32:50 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-5e26a35b-7700-4fe5-b4cd-9045c7253e3a
+18/10/24 20:32:50 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-6b3b50ff-32dc-4256-af6f-adbe92b866ff
+18/10/24 20:37:30 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
+18/10/24 20:37:32 INFO SparkContext: Running Spark version 2.3.2
+18/10/24 20:37:32 INFO SparkContext: Submitted application: sparklyr
+18/10/24 20:37:32 INFO SecurityManager: Changing view acls to: scibr
+18/10/24 20:37:32 INFO SecurityManager: Changing modify acls to: scibr
+18/10/24 20:37:32 INFO SecurityManager: Changing view acls groups to:
+18/10/24 20:37:32 INFO SecurityManager: Changing modify acls groups to:
+18/10/24 20:37:32 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(scibr); groups with view permissions: Set(); users with modify permissions: Set(scibr); groups with modify permissions: Set()
+18/10/24 20:37:32 INFO Utils: Successfully started service 'sparkDriver' on port 50206.
+18/10/24 20:37:32 INFO SparkEnv: Registering MapOutputTracker
+18/10/24 20:37:32 INFO SparkEnv: Registering BlockManagerMaster
+18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
+18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
+18/10/24 20:37:32 INFO DiskBlockManager: Created local directory at C:\Users\scibr\AppData\Local\Temp\blockmgr-9b3cbfbd-5baa-4af8-b300-189da3dcae01
+18/10/24 20:37:32 INFO MemoryStore: MemoryStore started with capacity 366.3 MB
+18/10/24 20:37:32 INFO SparkEnv: Registering OutputCommitCoordinator
+18/10/24 20:37:32 INFO Utils: Successfully started service 'SparkUI' on port 4040.
+18/10/24 20:37:32 INFO SparkUI: Bound SparkUI to 127.0.0.1, and started at http://127.0.0.1:4040
+18/10/24 20:37:32 INFO SparkContext: Added JAR file:/C:/Users/scibr/Documents/R/win-library/3.5/sparklyr/java/sparklyr-2.3-2.11.jar at spark://127.0.0.1:50206/jars/sparklyr-2.3-2.11.jar with timestamp 1540381052772
+18/10/24 20:37:32 INFO Executor: Starting executor ID driver on host localhost
+18/10/24 20:37:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50227.
+18/10/24 20:37:32 INFO NettyBlockTransferService: Server created on 127.0.0.1:50227
+18/10/24 20:37:32 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
+18/10/24 20:37:32 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 127.0.0.1, 50227, None)
+18/10/24 20:37:32 INFO BlockManagerMasterEndpoint: Registering block manager 127.0.0.1:50227 with 366.3 MB RAM, BlockManagerId(driver, 127.0.0.1, 50227, None)
+18/10/24 20:37:32 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 127.0.0.1, 50227, None)
+18/10/24 20:37:32 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 127.0.0.1, 50227, None)
+18/10/24 20:37:33 WARN SparkContext: Using an existing SparkContext; some configuration may not take effect.
+18/10/24 20:37:36 INFO SparkContext: Invoking stop() from shutdown hook
+18/10/24 20:37:36 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4040
+18/10/24 20:37:36 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
+18/10/24 20:37:36 INFO MemoryStore: MemoryStore cleared
+18/10/24 20:37:36 INFO BlockManager: BlockManager stopped
+18/10/24 20:37:36 INFO BlockManagerMaster: BlockManagerMaster stopped
+18/10/24 20:37:36 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
+18/10/24 20:37:36 INFO SparkContext: Successfully stopped SparkContext
+18/10/24 20:37:36 INFO ShutdownHookManager: Shutdown hook called
+18/10/24 20:37:36 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-4f2683b7-49df-4b71-8096-73acfb00a16e
+18/10/24 20:37:36 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-48587667-83df-4a73-8fe8-94189b1125cf
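The log above is ordinary start-up/shutdown chatter from a local `sparklyr` session (the `NativeCodeLoader` warning is expected on Windows and harmless). A minimal sketch of a session that would produce such output, assuming Spark 2.3.2 is installed locally:

```r
library(sparklyr)

# Connect to a local Spark 2.3.2 instance; this produces the SparkContext /
# SparkUI / BlockManager INFO lines seen in the log.
sc <- spark_connect(master = 'local', version = '2.3.2')

# ... copy data over and run Spark jobs here ...

# Disconnect; this produces the stop() / shutdown-hook lines.
spark_disconnect(sc)
```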

logs/log4j.spark.log.2018-10-23

+11
@@ -0,0 +1,11 @@
+18/10/23 15:38:44 INFO SparkContext: Invoking stop() from shutdown hook
+18/10/23 15:38:44 INFO SparkUI: Stopped Spark web UI at http://127.0.0.1:4041
+18/10/23 15:38:44 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
+18/10/23 15:38:44 INFO MemoryStore: MemoryStore cleared
+18/10/23 15:38:44 INFO BlockManager: BlockManager stopped
+18/10/23 15:38:44 INFO BlockManagerMaster: BlockManagerMaster stopped
+18/10/23 15:38:44 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
+18/10/23 15:38:44 INFO SparkContext: Successfully stopped SparkContext
+18/10/23 15:38:44 INFO ShutdownHookManager: Shutdown hook called
+18/10/23 15:38:44 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-96fd60e0-c499-441b-971d-e8c65875c935
+18/10/23 15:38:44 INFO ShutdownHookManager: Deleting directory C:\Users\scibr\AppData\Local\Temp\spark-36507bed-827c-4f38-8a2e-a362bd113c59
Binary file not shown.
