Closed
Changes from all commits
652 commits
092e2f1
SPARK-2425 Don't kill a still-running Application because of some mis…
markhamstra Sep 9, 2014
ce5cb32
[Build] Removed -Phive-thriftserver since this profile has been removed
liancheng Sep 9, 2014
c419e4f
[Docs] actorStream storageLevel default is MEMORY_AND_DISK_SER_2
melrief Sep 9, 2014
1e03cf7
[SPARK-3455] [SQL] **HOT FIX** Fix the unit test failure
chenghao-intel Sep 9, 2014
88547a0
SPARK-3422. JavaAPISuite.getHadoopInputSplits isn't used anywhere.
sryza Sep 9, 2014
f0f1ba0
SPARK-3404 [BUILD] SparkSubmitSuite fails with "spark-submit exits wi…
srowen Sep 9, 2014
2686233
[SPARK-3193]output errer info when Process exit code is not zero in t…
scwf Sep 9, 2014
02b5ac7
Minor - Fix trivial compilation warnings.
ScrapCodes Sep 9, 2014
07ee4a2
[SPARK-3176] Implement 'ABS and 'LAST' for sql
Sep 9, 2014
c110614
[SPARK-3448][SQL] Check for null in SpecificMutableRow.update
liancheng Sep 10, 2014
25b5b86
[SPARK-3458] enable python "with" statements for SparkContext
Sep 10, 2014
b734ed0
[SPARK-3395] [SQL] DSL sometimes incorrectly reuses attribute ids, br…
Sep 10, 2014
6f7a768
[SPARK-3286] - Cannot view ApplicationMaster UI when Yarn’s url schem…
Sep 10, 2014
a028330
[SPARK-3362][SQL] Fix resolution for casewhen with nulls.
adrian-wang Sep 10, 2014
f0c87dc
[SPARK-3363][SQL] Type Coercion should promote null to all other types.
adrian-wang Sep 10, 2014
26503fd
[HOTFIX] Fix scala style issue introduced by #2276.
JoshRosen Sep 10, 2014
1f4a648
SPARK-1713. Use a thread pool for launching executors.
sryza Sep 10, 2014
e4f4886
[SPARK-2096][SQL] Correctly parse dot notations
cloud-fan Sep 10, 2014
558962a
[SPARK-3411] Improve load-balancing of concurrently-submitted drivers…
WangTaoTheTonic Sep 10, 2014
79cdb9b
[SPARK-2207][SPARK-3272][MLLib]Add minimum information gain and minim…
Sep 10, 2014
84e2c8b
[SQL] Add test case with workaround for reading partitioned Avro files
marmbrus Sep 11, 2014
f92cde2
[SPARK-3447][SQL] Remove explicit conversion with JListWrapper to avo…
marmbrus Sep 11, 2014
c27718f
[SPARK-2781][SQL] Check resolution of LogicalPlans in Analyzer.
staple Sep 11, 2014
ed1980f
[SPARK-2140] Updating heap memory calculation for YARN stable and alpha.
Sep 11, 2014
1ef656e
[SPARK-3047] [PySpark] add an option to use str in textFileRDD
davies Sep 11, 2014
ca83f1e
[SPARK-2917] [SQL] Avoid table creation in logical plan analyzing for…
chenghao-intel Sep 11, 2014
4bc9e04
[SPARK-3390][SQL] sqlContext.jsonRDD fails on a complex structure of …
yhuai Sep 11, 2014
6324eb7
[Spark-3490] Disable SparkUI for tests
andrewor14 Sep 12, 2014
ce59725
[SPARK-3429] Don't include the empty string "" as a defaultAclUser
ash211 Sep 12, 2014
f858f46
SPARK-3462 push down filters and projections into Unions
Sep 12, 2014
33c7a73
SPARK-2482: Resolve sbt warnings during build
witgo Sep 12, 2014
42904b8
[SPARK-3465] fix task metrics aggregation in local mode
davies Sep 12, 2014
b8634df
[SPARK-3160] [SPARK-3494] [mllib] DecisionTree: eliminate pre-alloca…
jkbradley Sep 12, 2014
f116f76
[SPARK-2558][DOCS] Add --queue example to YARN doc
kramimus Sep 12, 2014
5333776
[PySpark] Add blank line so that Python RDD.top() docstring renders c…
rnowling Sep 12, 2014
8194fc6
[SPARK-3481] [SQL] Eliminate the error log in local Hive comparison test
chenghao-intel Sep 12, 2014
eae81b0
MAINTENANCE: Automated closing of pull requests.
pwendell Sep 12, 2014
15a5645
[SPARK-3427] [GraphX] Avoid active vertex tracking in static PageRank
ankurdave Sep 12, 2014
1d76796
SPARK-3014. Log a more informative messages in a couple failure scena…
sryza Sep 12, 2014
af25838
[SPARK-3217] Add Guava to classpath when SPARK_PREPEND_CLASSES is set.
Sep 12, 2014
25311c2
[SPARK-3456] YarnAllocator on alpha can lose container requests to RM
tgravescs Sep 13, 2014
71af030
[SPARK-3094] [PySpark] compatitable with PyPy
davies Sep 13, 2014
885d162
[SPARK-3500] [SQL] use JavaSchemaRDD as SchemaRDD._jschema_rdd
davies Sep 13, 2014
6d887db
[SPARK-3515][SQL] Moves test suite setup code to beforeAll rather tha…
liancheng Sep 13, 2014
2584ea5
[SPARK-3469] Make sure all TaskCompletionListener are called even wit…
rxin Sep 13, 2014
e11eeb7
[SQL][Docs] Update SQL programming guide to show the correct default …
yhuai Sep 13, 2014
feaa370
SPARK-3470 [CORE] [STREAMING] Add Closeable / close() to Java context…
srowen Sep 13, 2014
b4dded4
Proper indent for the previous commit.
rxin Sep 13, 2014
a523cea
[SQL] [Docs] typo fixes
nchammas Sep 13, 2014
184cd51
[SPARK-3481][SQL] Removes the evil MINOR HACK
liancheng Sep 13, 2014
7404924
[SPARK-3294][SQL] Eliminates boxing costs from in-memory columnar sto…
liancheng Sep 13, 2014
0f8c4ed
[SQL] Decrease partitions when testing
marmbrus Sep 13, 2014
2aea0da
[SPARK-3030] [PySpark] Reuse Python worker
davies Sep 13, 2014
4e3fbe8
[SPARK-3463] [PySpark] aggregate and show spilled bytes in Python
davies Sep 14, 2014
c243b21
SPARK-3039: Allow spark to be built using avro-mapred for hadoop2
bbossy Sep 15, 2014
f493f79
[SPARK-3452] Maven build should skip publishing artifacts people shou…
ScrapCodes Sep 15, 2014
cc14644
[SPARK-3410] The priority of shutdownhook for ApplicationMaster shoul…
sarutak Sep 15, 2014
fe2b1d6
[SPARK-3425] do not set MaxPermSize for OpenJDK 1.8
Sep 15, 2014
e59fac1
[SPARK-3518] Remove wasted statement in JsonProtocol
sarutak Sep 15, 2014
37d9252
[SPARK-2714] DAGScheduler logs jobid when runJob finishes
YanTangZhai Sep 15, 2014
3b93128
[SPARK-3396][MLLIB] Use SquaredL2Updater in LogisticRegressionWithSGD
BigCrunsh Sep 16, 2014
983d6a9
[MLlib] Update SVD documentation in IndexedRowMatrix
rezazadeh Sep 16, 2014
fdb302f
[SPARK-3516] [mllib] DecisionTree: Add minInstancesPerNode, minInfoGa…
Sep 16, 2014
da33acb
[SPARK-2951] [PySpark] support unpickle array.array for Python 2.6
davies Sep 16, 2014
60050f4
[SPARK-1087] Move python traceback utilities into new traceback_utils…
staple Sep 16, 2014
d428ac6
[SPARK-3540] Add reboot-slaves functionality to the ec2 script
rxin Sep 16, 2014
ecf0c02
[SPARK-3433][BUILD] Fix for Mima false-positives with @DeveloperAPI a…
ScrapCodes Sep 16, 2014
febafef
[SPARK-3040] pick up a more proper local ip address for Utils.findLoc…
advancedxy Sep 16, 2014
61e21fe
SPARK-3069 [DOCS] Build instructions in README are outdated
srowen Sep 16, 2014
7b8008f
[SPARK-2182] Scalastyle rule blocking non ascii characters.
ScrapCodes Sep 16, 2014
86d253e
[SPARK-3527] [SQL] Strip the string message
chenghao-intel Sep 16, 2014
9d5fa76
[SPARK-3519] add distinct(n) to PySpark
Sep 16, 2014
7583699
[SPARK-3308][SQL] Ability to read JSON Arrays as tables
yhuai Sep 16, 2014
30f288a
[SPARK-2890][SQL] Allow reading of data when case insensitive resolut…
marmbrus Sep 16, 2014
8e7ae47
[SPARK-2314][SQL] Override collect and take in python library, and co…
staple Sep 16, 2014
df90e81
[Docs] minor punctuation fix
nchammas Sep 16, 2014
84073eb
[SQL][DOCS] Improve section on thrift-server
marmbrus Sep 16, 2014
a9e9104
[SPARK-3546] InputStream of ManagedBuffer is not closed and causes ru…
sarutak Sep 16, 2014
ec1adec
[SPARK-3430] [PySpark] [Doc] generate PySpark API docs using Sphinx
davies Sep 16, 2014
b201712
[SPARK-787] Add S3 configuration parameters to the EC2 deploy scripts
danosipov Sep 16, 2014
a6e1712
Add a Community Projects page
velvia Sep 16, 2014
0a7091e
[SPARK-3555] Fix UISuite race condition
andrewor14 Sep 16, 2014
008a5ed
[Minor]ignore all config files in conf
scwf Sep 17, 2014
983609a
[Docs] Correct spark.files.fetchTimeout default value
viper-kun Sep 17, 2014
7d1a372
SPARK-3177 (on Master Branch)
Sep 17, 2014
8fbd5f4
[Docs] minor grammar fix
nchammas Sep 17, 2014
cbf983b
[SQL][DOCS] Improve table caching section
marmbrus Sep 17, 2014
5044e49
[SPARK-1455] [SPARK-3534] [Build] When possible, run SQL tests only.
nchammas Sep 17, 2014
b3830b2
Docs: move HA subsections to a deeper indentation level
ash211 Sep 17, 2014
7fc3bb7
[SPARK-3534] Fix expansion of testing arguments to sbt
nchammas Sep 17, 2014
cbc0650
[SPARK-3571] Spark standalone cluster mode doesn't work.
sarutak Sep 17, 2014
6688a26
[SPARK-3564][WebUI] Display App ID on HistoryPage
sarutak Sep 17, 2014
1147973
[SPARK-3567] appId field in SparkDeploySchedulerBackend should be vol…
sarutak Sep 17, 2014
3f169bf
[SPARK-3565]Fix configuration item not consistent with document
WangTaoTheTonic Sep 18, 2014
5547fa1
[SPARK-3534] Add hive-thriftserver to SQL tests
nchammas Sep 18, 2014
6772afe
[Minor] rat exclude dependency-reduced-pom.xml
witgo Sep 18, 2014
3447d10
[SPARK-3547]Using a special exit code instead of 1 to represent Class…
WangTaoTheTonic Sep 18, 2014
3ad4176
SPARK-3579 Jekyll doc generation is different across environments.
pwendell Sep 18, 2014
6cab838
[SPARK-3566] [BUILD] .gitignore and .rat-excludes should consider Win…
sarutak Sep 18, 2014
471e6a3
[SPARK-3589][Minor]remove redundant code
WangTaoTheTonic Sep 18, 2014
b3ed37e
[SPARK-3560] Fixed setting spark.jars system property in yarn-cluster…
Victsm Sep 18, 2014
9306297
[Minor Hot Fix] Move a line in SparkSubmit to the right place
andrewor14 Sep 19, 2014
e77fa81
[SPARK-3554] [PySpark] use broadcast automatically for large closure
davies Sep 19, 2014
e76ef5c
[SPARK-3418] Sparse Matrix support (CCS) and additional native BLAS o…
brkyvz Sep 19, 2014
3bbbdd8
[SPARK-2062][GraphX] VertexRDD.apply does not use the mergeFunc
larryxiao Sep 19, 2014
a48956f
MAINTENANCE: Automated closing of pull requests.
pwendell Sep 19, 2014
be0c756
[SPARK-1701] Clarify slice vs partition in the programming guide
Sep 19, 2014
a03e5b8
[SPARK-1701] [PySpark] remove slice terminology from python examples
Sep 19, 2014
fce5e25
[SPARK-3491] [MLlib] [PySpark] use pickle to serialize data in MLlib
davies Sep 19, 2014
2c3cc76
[SPARK-3501] [SQL] Fix the bug of Hive SimpleUDF creates unnecessary …
chenghao-intel Sep 19, 2014
5522151
[SPARK-2594][SQL] Support CACHE TABLE <name> AS SELECT ...
ravipesala Sep 19, 2014
a95ad99
[SPARK-3592] [SQL] [PySpark] support applySchema to RDD of Row
davies Sep 19, 2014
3b9cd13
SPARK-3605. Fix typo in SchemaRDD.
sryza Sep 19, 2014
ba68a51
[SPARK-3485][SQL] Use GenericUDFUtils.ConversionHelper for Simple UDF…
adrian-wang Sep 19, 2014
99b06b6
[Build] Fix passing of args to sbt
nchammas Sep 19, 2014
8af2370
[Docs] Fix outdated docs for standalone cluster
andrewor14 Sep 19, 2014
78d4220
SPARK-3608 Break if the instance tag naming succeeds
vidaha Sep 20, 2014
454981d
initial commit for pySparkStreaming
giwa Jul 9, 2014
b406252
comment PythonDStream.PairwiseDStream
Jul 15, 2014
87438e2
modify dstream.py to fix indent error
Jul 16, 2014
d7b4d6f
added reducedByKey not working yet
Jul 16, 2014
1a0f065
implementing transform function in Python
Jul 16, 2014
17a74c6
modified the code base on comment in https://github.com/tdas/spark/pu…
Jul 16, 2014
494cae5
remove not implemented DStream functions in python
Jul 16, 2014
e1df940
revert pom.xml
Jul 16, 2014
5bac7ec
revert streaming/pom.xml
Jul 16, 2014
d2099d8
sorted the import following Spark coding convention
Jul 16, 2014
224fc5e
add empty line
Jul 16, 2014
bb7ccf3
remove unused import in python
Jul 16, 2014
f746109
initial commit for socketTextStream
Jul 16, 2014
0d1b954
fied input of socketTextDStream
Jul 16, 2014
ccfd214
added doctest for pyspark.streaming.duration
Jul 17, 2014
b31446a
fixed typo of network_workdcount.py
Jul 17, 2014
dc6995d
delete old file
Jul 17, 2014
c455c8d
added reducedByKey not working yet
Jul 16, 2014
6f98e50
reduceByKey is working
Jul 17, 2014
15feea9
edit python sparkstreaming example
Jul 18, 2014
d3ee86a
added count operation but this implementation need double check
Jul 19, 2014
72b9738
fix map function
Jul 20, 2014
bab31c1
clean up code
Jul 20, 2014
0a8bbbb
clean up codes
Jul 20, 2014
678e854
remove waste file
Jul 20, 2014
b1d2a30
Implemented DStream.foreachRDD in the Python API using Py4J callback …
tdas Jul 23, 2014
05e991b
Added missing file
tdas Aug 1, 2014
9ab8952
Added extra line.
tdas Aug 1, 2014
84a9668
tried to restart callback server
Aug 2, 2014
3b498e1
Kill py4j callback server properly
Aug 3, 2014
b349649
Removed the waste line
giwa Aug 3, 2014
3c45cd2
implemented reduce and count function in Dstream
giwa Aug 4, 2014
d2c01ba
clean up examples
giwa Aug 4, 2014
c462bb3
added stop in StreamingContext
giwa Aug 4, 2014
4d40d63
clean up dstream.py
giwa Aug 4, 2014
29c2bc5
initial commit for testcase
giwa Aug 4, 2014
fe648e3
WIP
giwa Aug 4, 2014
8a0fbbc
update comment
giwa Aug 4, 2014
1523b66
WIP
giwa Aug 4, 2014
1df77f5
WIP: added PythonTestInputStream
giwa Aug 5, 2014
9ad6855
WIP
giwa Aug 7, 2014
ce2acd2
WIP added test case
giwa Aug 11, 2014
878bad7
added basic operation test cases
giwa Aug 11, 2014
f21cab3
delete waste file
giwa Aug 11, 2014
3d37822
fixed PEP-008 violation
giwa Aug 11, 2014
253a863
removed unnesessary changes
giwa Aug 11, 2014
bb10956
edited the comment to add more precise description
giwa Aug 11, 2014
270a9e1
added mapValues and flatMapVaules WIP for glom and mapPartitions test
giwa Aug 11, 2014
bcdec33
WIP: solved partitioned and None is not recognized
giwa Aug 14, 2014
ff14070
broke something
giwa Aug 14, 2014
3000b2b
all tests are passed if numSlice is 2 and the numver of each input is…
giwa Aug 15, 2014
13fb44c
basic function test cases are passed
giwa Aug 15, 2014
18c8723
modified streaming test case to add coment
giwa Aug 15, 2014
f76c182
remove waste duplicated code
giwa Aug 15, 2014
74535d4
added saveAsTextFiles and saveAsPickledFiles
giwa Aug 16, 2014
16aa64f
added TODO coments
giwa Aug 16, 2014
e54f986
add comments
giwa Aug 18, 2014
10b5b04
removed wasted print in DStream
giwa Aug 18, 2014
10ab87b
added sparkContext as input parameter in StreamingContext
giwa Aug 18, 2014
5625bdc
added gorupByKey testcase
giwa Aug 18, 2014
c214199
added testcase for combineByKey
giwa Aug 18, 2014
0b99bec
initial commit for pySparkStreaming
giwa Jul 9, 2014
41886c2
comment PythonDStream.PairwiseDStream
Jul 15, 2014
66fcfff
modify dstream.py to fix indent error
Jul 16, 2014
38adf95
added reducedByKey not working yet
Jul 16, 2014
4bcb318
implementing transform function in Python
Jul 16, 2014
247fd74
modified the code base on comment in https://github.com/tdas/spark/pu…
Jul 16, 2014
dd6de81
initial commit for socketTextStream
Jul 16, 2014
f485b1d
fied input of socketTextDStream
Jul 16, 2014
0df7111
delete old file
Jul 17, 2014
58591d2
reduceByKey is working
Jul 17, 2014
98c2a00
added count operation but this implementation need double check
Jul 19, 2014
eb4bf48
fix map function
Jul 20, 2014
6197a11
clean up code
Jul 20, 2014
2ad7bd3
clean up codes
Jul 20, 2014
fe02547
remove waste file
Jul 20, 2014
4f07163
Implemented DStream.foreachRDD in the Python API using Py4J callback …
tdas Jul 23, 2014
54b5358
tried to restart callback server
Aug 2, 2014
88f7506
Kill py4j callback server properly
Aug 3, 2014
1b83354
Removed the waste line
giwa Aug 3, 2014
92e333e
implemented reduce and count function in Dstream
giwa Aug 4, 2014
0b09cff
added stop in StreamingContext
giwa Aug 4, 2014
932372a
clean up dstream.py
giwa Aug 4, 2014
376e3ac
WIP
giwa Aug 4, 2014
1934726
update comment
giwa Aug 4, 2014
019ef38
WIP
giwa Aug 4, 2014
5c04a5f
WIP: added PythonTestInputStream
giwa Aug 5, 2014
bd3ba53
WIP
giwa Aug 7, 2014
9cde7c9
WIP added test case
giwa Aug 11, 2014
b3b0362
added basic operation test cases
giwa Aug 11, 2014
99410be
delete waste file
giwa Aug 11, 2014
c1d546e
fixed PEP-008 violation
giwa Aug 11, 2014
af610d3
removed unnesessary changes
giwa Aug 11, 2014
953deb0
edited the comment to add more precise description
giwa Aug 11, 2014
f67cf57
added mapValues and flatMapVaules WIP for glom and mapPartitions test
giwa Aug 11, 2014
1e126bf
WIP: solved partitioned and None is not recognized
giwa Aug 14, 2014
795b2cd
broke something
giwa Aug 14, 2014
8dcda84
all tests are passed if numSlice is 2 and the numver of each input is…
giwa Aug 15, 2014
c5ecfc1
basic function test cases are passed
giwa Aug 15, 2014
2a06cdb
remove waste duplicated code
giwa Aug 15, 2014
99ce042
added saveAsTextFiles and saveAsPickledFiles
giwa Aug 16, 2014
ddd4ee1
added TODO coments
giwa Aug 16, 2014
af336b7
add comments
giwa Aug 18, 2014
455e5af
removed wasted print in DStream
giwa Aug 18, 2014
58e41ff
merge with master
giwa Aug 18, 2014
e80647e
adopted the latest compression way of python command
giwa Aug 19, 2014
c00e091
change test case not to use awaitTermination
giwa Aug 19, 2014
3166d31
clean up
giwa Aug 20, 2014
f198d14
clean up code
giwa Aug 21, 2014
b171ec3
fixed pep8 violation
giwa Aug 21, 2014
f04882c
clen up examples
giwa Aug 21, 2014
62dc7a3
clean up exmples
giwa Aug 21, 2014
7dc7391
fixed typo
giwa Aug 21, 2014
6ae3caa
revert pom.xml
giwa Aug 21, 2014
fa4af88
remove duplicated import
giwa Aug 21, 2014
066ba90
revert pom.xml
giwa Aug 21, 2014
8ed93af
fixed explanaiton
giwa Aug 21, 2014
fbed8da
revert pom.xml
giwa Aug 21, 2014
bebb3f3
remove the last brank line
giwa Aug 21, 2014
b0f2015
added comment in dstream._test_output
giwa Aug 21, 2014
f385976
delete inproper comments
giwa Aug 21, 2014
c0a06bc
delete not implemented functions
giwa Aug 21, 2014
2fdf0de
Fix scalastyle errors
Aug 26, 2014
d542743
clean up code
giwa Aug 31, 2014
d39f102
added StreamingContext.remember
giwa Aug 31, 2014
63c881a
added StreamingContext.sparkContext
giwa Aug 31, 2014
d5f5fcb
added comment for StreamingContext.sparkContext
giwa Aug 31, 2014
8ffdbf1
added atexit to handle callback server
giwa Aug 31, 2014
4a59e1e
WIP:added more test for StreamingContext
giwa Aug 31, 2014
2d32a74
added some StreamingContextTestSuite
giwa Sep 1, 2014
e685853
meged with rebased 1.1 branch
giwa Sep 20, 2014
5cdb6fa
changed for SCCallSiteSync
giwa Sep 21, 2014
550dfd9
WIP fixing 1.1 merge
giwa Sep 21, 2014
13 changes: 7 additions & 6 deletions .gitignore
@@ -1,4 +1,6 @@
*~
*.#*
*#*#
*.swp
*.ipr
*.iml
@@ -15,11 +17,11 @@ out/
third_party/libmesos.so
third_party/libmesos.dylib
conf/java-opts
conf/spark-env.sh
conf/streaming-env.sh
conf/log4j.properties
conf/spark-defaults.conf
conf/hive-site.xml
conf/*.sh
conf/*.cmd
conf/*.properties
conf/*.conf
conf/*.xml
docs/_site
docs/api
target/
@@ -50,7 +52,6 @@ unit-tests.log
/lib/
rat-results.txt
scalastyle.txt
conf/*.conf
scalastyle-output.xml

# For Hive
3 changes: 3 additions & 0 deletions .rat-excludes
@@ -20,6 +20,7 @@ log4j.properties.template
metrics.properties.template
slaves
spark-env.sh
spark-env.cmd
spark-env.sh.template
log4j-defaults.properties
bootstrap-tooltip.js
@@ -31,6 +32,7 @@ sorttable.js
.*data
.*log
cloudpickle.py
heapq3.py
join.py
SparkExprTyper.scala
SparkILoop.scala
@@ -57,3 +59,4 @@ dist/*
.*iws
logs
.*scalastyle-output.xml
.*dependency-reduced-pom.xml
12 changes: 12 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,12 @@
## Contributing to Spark

Contributions via GitHub pull requests are gladly accepted from their original
author. Along with any pull requests, please state that the contribution is
your original work and that you license the work to the project under the
project's open source license. Whether or not you state this explicitly, by
submitting any copyrighted material via pull request, email, or other means
you agree to license the material under the project's open source license and
warrant that you have the legal authority to do so.

Please see the [Contributing to Spark wiki page](https://cwiki.apache.org/SPARK/Contributing+to+Spark)
for more information.
283 changes: 283 additions & 0 deletions LICENSE
@@ -338,6 +338,289 @@ THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

========================================================================
For heapq (pyspark/heapq3.py):
========================================================================

# A. HISTORY OF THE SOFTWARE
# ==========================
#
# Python was created in the early 1990s by Guido van Rossum at Stichting
# Mathematisch Centrum (CWI, see http://www.cwi.nl) in the Netherlands
# as a successor of a language called ABC. Guido remains Python's
# principal author, although it includes many contributions from others.
#
# In 1995, Guido continued his work on Python at the Corporation for
# National Research Initiatives (CNRI, see http://www.cnri.reston.va.us)
# in Reston, Virginia where he released several versions of the
# software.
#
# In May 2000, Guido and the Python core development team moved to
# BeOpen.com to form the BeOpen PythonLabs team. In October of the same
# year, the PythonLabs team moved to Digital Creations (now Zope
# Corporation, see http://www.zope.com). In 2001, the Python Software
# Foundation (PSF, see http://www.python.org/psf/) was formed, a
# non-profit organization created specifically to own Python-related
# Intellectual Property. Zope Corporation is a sponsoring member of
# the PSF.
#
# All Python releases are Open Source (see http://www.opensource.org for
# the Open Source Definition). Historically, most, but not all, Python
# releases have also been GPL-compatible; the table below summarizes
# the various releases.
#
# Release Derived Year Owner GPL-
# from compatible? (1)
#
# 0.9.0 thru 1.2 1991-1995 CWI yes
# 1.3 thru 1.5.2 1.2 1995-1999 CNRI yes
# 1.6 1.5.2 2000 CNRI no
# 2.0 1.6 2000 BeOpen.com no
# 1.6.1 1.6 2001 CNRI yes (2)
# 2.1 2.0+1.6.1 2001 PSF no
# 2.0.1 2.0+1.6.1 2001 PSF yes
# 2.1.1 2.1+2.0.1 2001 PSF yes
# 2.2 2.1.1 2001 PSF yes
# 2.1.2 2.1.1 2002 PSF yes
# 2.1.3 2.1.2 2002 PSF yes
# 2.2.1 2.2 2002 PSF yes
# 2.2.2 2.2.1 2002 PSF yes
# 2.2.3 2.2.2 2003 PSF yes
# 2.3 2.2.2 2002-2003 PSF yes
# 2.3.1 2.3 2002-2003 PSF yes
# 2.3.2 2.3.1 2002-2003 PSF yes
# 2.3.3 2.3.2 2002-2003 PSF yes
# 2.3.4 2.3.3 2004 PSF yes
# 2.3.5 2.3.4 2005 PSF yes
# 2.4 2.3 2004 PSF yes
# 2.4.1 2.4 2005 PSF yes
# 2.4.2 2.4.1 2005 PSF yes
# 2.4.3 2.4.2 2006 PSF yes
# 2.4.4 2.4.3 2006 PSF yes
# 2.5 2.4 2006 PSF yes
# 2.5.1 2.5 2007 PSF yes
# 2.5.2 2.5.1 2008 PSF yes
# 2.5.3 2.5.2 2008 PSF yes
# 2.6 2.5 2008 PSF yes
# 2.6.1 2.6 2008 PSF yes
# 2.6.2 2.6.1 2009 PSF yes
# 2.6.3 2.6.2 2009 PSF yes
# 2.6.4 2.6.3 2009 PSF yes
# 2.6.5 2.6.4 2010 PSF yes
# 2.7 2.6 2010 PSF yes
#
# Footnotes:
#
# (1) GPL-compatible doesn't mean that we're distributing Python under
# the GPL. All Python licenses, unlike the GPL, let you distribute
# a modified version without making your changes open source. The
# GPL-compatible licenses make it possible to combine Python with
# other software that is released under the GPL; the others don't.
#
# (2) According to Richard Stallman, 1.6.1 is not GPL-compatible,
# because its license has a choice of law clause. According to
# CNRI, however, Stallman's lawyer has told CNRI's lawyer that 1.6.1
# is "not incompatible" with the GPL.
#
# Thanks to the many outside volunteers who have worked under Guido's
# direction to make these releases possible.
#
#
# B. TERMS AND CONDITIONS FOR ACCESSING OR OTHERWISE USING PYTHON
# ===============================================================
#
# PYTHON SOFTWARE FOUNDATION LICENSE VERSION 2
# --------------------------------------------
#
# 1. This LICENSE AGREEMENT is between the Python Software Foundation
# ("PSF"), and the Individual or Organization ("Licensee") accessing and
# otherwise using this software ("Python") in source or binary form and
# its associated documentation.
#
# 2. Subject to the terms and conditions of this License Agreement, PSF hereby
# grants Licensee a nonexclusive, royalty-free, world-wide license to reproduce,
# analyze, test, perform and/or display publicly, prepare derivative works,
# distribute, and otherwise use Python alone or in any derivative version,
# provided, however, that PSF's License Agreement and PSF's notice of copyright,
# i.e., "Copyright (c) 2001, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009, 2010,
# 2011, 2012, 2013 Python Software Foundation; All Rights Reserved" are retained
# in Python alone or in any derivative version prepared by Licensee.
#
# 3. In the event Licensee prepares a derivative work that is based on
# or incorporates Python or any part thereof, and wants to make
# the derivative work available to others as provided herein, then
# Licensee hereby agrees to include in any such work a brief summary of
# the changes made to Python.
#
# 4. PSF is making Python available to Licensee on an "AS IS"
# basis. PSF MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
# IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, PSF MAKES NO AND
# DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
# FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON WILL NOT
# INFRINGE ANY THIRD PARTY RIGHTS.
#
# 5. PSF SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
# FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
# A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON,
# OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
#
# 6. This License Agreement will automatically terminate upon a material
# breach of its terms and conditions.
#
# 7. Nothing in this License Agreement shall be deemed to create any
# relationship of agency, partnership, or joint venture between PSF and
# Licensee. This License Agreement does not grant permission to use PSF
# trademarks or trade name in a trademark sense to endorse or promote
# products or services of Licensee, or any third party.
#
# 8. By copying, installing or otherwise using Python, Licensee
# agrees to be bound by the terms and conditions of this License
# Agreement.
#
#
# BEOPEN.COM LICENSE AGREEMENT FOR PYTHON 2.0
# -------------------------------------------
#
# BEOPEN PYTHON OPEN SOURCE LICENSE AGREEMENT VERSION 1
#
# 1. This LICENSE AGREEMENT is between BeOpen.com ("BeOpen"), having an
# office at 160 Saratoga Avenue, Santa Clara, CA 95051, and the
# Individual or Organization ("Licensee") accessing and otherwise using
# this software in source or binary form and its associated
# documentation ("the Software").
#
# 2. Subject to the terms and conditions of this BeOpen Python License
# Agreement, BeOpen hereby grants Licensee a non-exclusive,
# royalty-free, world-wide license to reproduce, analyze, test, perform
# and/or display publicly, prepare derivative works, distribute, and
# otherwise use the Software alone or in any derivative version,
# provided, however, that the BeOpen Python License is retained in the
# Software, alone or in any derivative version prepared by Licensee.
#
# 3. BeOpen is making the Software available to Licensee on an "AS IS"
# basis. BEOPEN MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
# IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, BEOPEN MAKES NO AND
# DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
# FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF THE SOFTWARE WILL NOT
# INFRINGE ANY THIRD PARTY RIGHTS.
#
# 4. BEOPEN SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF THE
# SOFTWARE FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS
# AS A RESULT OF USING, MODIFYING OR DISTRIBUTING THE SOFTWARE, OR ANY
# DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
#
# 5. This License Agreement will automatically terminate upon a material
# breach of its terms and conditions.
#
# 6. This License Agreement shall be governed by and interpreted in all
# respects by the law of the State of California, excluding conflict of
# law provisions. Nothing in this License Agreement shall be deemed to
# create any relationship of agency, partnership, or joint venture
# between BeOpen and Licensee. This License Agreement does not grant
# permission to use BeOpen trademarks or trade names in a trademark
# sense to endorse or promote products or services of Licensee, or any
# third party. As an exception, the "BeOpen Python" logos available at
# http://www.pythonlabs.com/logos.html may be used according to the
# permissions granted on that web page.
#
# 7. By copying, installing or otherwise using the software, Licensee
# agrees to be bound by the terms and conditions of this License
# Agreement.
#
#
# CNRI LICENSE AGREEMENT FOR PYTHON 1.6.1
# ---------------------------------------
#
# 1. This LICENSE AGREEMENT is between the Corporation for National
# Research Initiatives, having an office at 1895 Preston White Drive,
# Reston, VA 20191 ("CNRI"), and the Individual or Organization
# ("Licensee") accessing and otherwise using Python 1.6.1 software in
# source or binary form and its associated documentation.
#
# 2. Subject to the terms and conditions of this License Agreement, CNRI
# hereby grants Licensee a nonexclusive, royalty-free, world-wide
# license to reproduce, analyze, test, perform and/or display publicly,
# prepare derivative works, distribute, and otherwise use Python 1.6.1
# alone or in any derivative version, provided, however, that CNRI's
# License Agreement and CNRI's notice of copyright, i.e., "Copyright (c)
# 1995-2001 Corporation for National Research Initiatives; All Rights
# Reserved" are retained in Python 1.6.1 alone or in any derivative
# version prepared by Licensee. Alternately, in lieu of CNRI's License
# Agreement, Licensee may substitute the following text (omitting the
# quotes): "Python 1.6.1 is made available subject to the terms and
# conditions in CNRI's License Agreement. This Agreement together with
# Python 1.6.1 may be located on the Internet using the following
# unique, persistent identifier (known as a handle): 1895.22/1013. This
# Agreement may also be obtained from a proxy server on the Internet
# using the following URL: http://hdl.handle.net/1895.22/1013".
#
# 3. In the event Licensee prepares a derivative work that is based on
# or incorporates Python 1.6.1 or any part thereof, and wants to make
# the derivative work available to others as provided herein, then
# Licensee hereby agrees to include in any such work a brief summary of
# the changes made to Python 1.6.1.
#
# 4. CNRI is making Python 1.6.1 available to Licensee on an "AS IS"
# basis. CNRI MAKES NO REPRESENTATIONS OR WARRANTIES, EXPRESS OR
# IMPLIED. BY WAY OF EXAMPLE, BUT NOT LIMITATION, CNRI MAKES NO AND
# DISCLAIMS ANY REPRESENTATION OR WARRANTY OF MERCHANTABILITY OR FITNESS
# FOR ANY PARTICULAR PURPOSE OR THAT THE USE OF PYTHON 1.6.1 WILL NOT
# INFRINGE ANY THIRD PARTY RIGHTS.
#
# 5. CNRI SHALL NOT BE LIABLE TO LICENSEE OR ANY OTHER USERS OF PYTHON
# 1.6.1 FOR ANY INCIDENTAL, SPECIAL, OR CONSEQUENTIAL DAMAGES OR LOSS AS
# A RESULT OF MODIFYING, DISTRIBUTING, OR OTHERWISE USING PYTHON 1.6.1,
# OR ANY DERIVATIVE THEREOF, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
#
# 6. This License Agreement will automatically terminate upon a material
# breach of its terms and conditions.
#
# 7. This License Agreement shall be governed by the federal
# intellectual property law of the United States, including without
# limitation the federal copyright law, and, to the extent such
# U.S. federal law does not apply, by the law of the Commonwealth of
# Virginia, excluding Virginia's conflict of law provisions.
# Notwithstanding the foregoing, with regard to derivative works based
# on Python 1.6.1 that incorporate non-separable material that was
# previously distributed under the GNU General Public License (GPL), the
# law of the Commonwealth of Virginia shall govern this License
# Agreement only as to issues arising under or with respect to
# Paragraphs 4, 5, and 7 of this License Agreement. Nothing in this
# License Agreement shall be deemed to create any relationship of
# agency, partnership, or joint venture between CNRI and Licensee. This
# License Agreement does not grant permission to use CNRI trademarks or
# trade name in a trademark sense to endorse or promote products or
# services of Licensee, or any third party.
#
# 8. By clicking on the "ACCEPT" button where indicated, or by copying,
# installing or otherwise using Python 1.6.1, Licensee agrees to be
# bound by the terms and conditions of this License Agreement.
#
# ACCEPT
#
#
# CWI LICENSE AGREEMENT FOR PYTHON 0.9.0 THROUGH 1.2
# --------------------------------------------------
#
# Copyright (c) 1991 - 1995, Stichting Mathematisch Centrum Amsterdam,
# The Netherlands. All rights reserved.
#
# Permission to use, copy, modify, and distribute this software and its
# documentation for any purpose and without fee is hereby granted,
# provided that the above copyright notice appear in all copies and that
# both that copyright notice and this permission notice appear in
# supporting documentation, and that the name of Stichting Mathematisch
# Centrum or CWI not be used in advertising or publicity pertaining to
# distribution of the software without specific, written prior
# permission.
#
# STICHTING MATHEMATISCH CENTRUM DISCLAIMS ALL WARRANTIES WITH REGARD TO
# THIS SOFTWARE, INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND
# FITNESS, IN NO EVENT SHALL STICHTING MATHEMATISCH CENTRUM BE LIABLE
# FOR ANY SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
# WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
# ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT
# OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.

========================================================================
For sorttable (core/src/main/resources/org/apache/spark/ui/static/sorttable.js):