GNNs at Scale With Graph Data Science Sampling
and GDS Python Client Integration
Adam Schill Collberg
Senior Software Engineer, Graph Data Science Team at Neo4j

Overview
•Problem Statement
◦Cora Dataset
◦Node Classification
•Using Graph Neural Networks (GNN)
◦A Very Brief Introduction
◦PyTorch and PyG
◦GNN Pros and Cons
•Graph Sampling
◦Random Walk with Restarts (RWR)
◦Neo4j Graph Data Science (GDS)
•Demo
•Summary and Further Learning

Problem Statement
Classifying subjects of research papers in a citation network

The Cora Dataset
•A research paper citation network
•2,708 research Paper nodes
•5,429 CITES relationships
Schema: (Paper)-[:CITES]->(Paper)
[Figure: paper nodes color-coded by subject]
[Figure: an example feature vector for Paper X over the keyword dimensions “equation”, “turing”, “graph”, with values 1, 0, 0]

Feature vector
•Each paper node has a length 1433 binary
feature vector
◦Each dimension represents a keyword
◦Value 1 if paper has keyword, 0 otherwise
•Each paper node belongs to a subject
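
For a concrete picture of these numbers, here is a hedged sketch of loading Cora with PyTorch Geometric's built-in Planetoid loader (PyG is introduced later in this deck; the root path below is just a placeholder):

```python
# Minimal sketch; assumes torch and torch_geometric are installed.
from torch_geometric.datasets import Planetoid

dataset = Planetoid(root="/tmp/Cora", name="Cora")  # downloads Cora on first use
data = dataset[0]                                   # a single torch_geometric.data.Data object

print(data.x.shape)           # torch.Size([2708, 1433]) -- binary keyword feature vectors
print(data.edge_index.shape)  # [2, num_directed_edges] -- each citation stored in both directions
print(data.y.shape)           # torch.Size([2708]) -- one subject label per paper
print(dataset.num_classes)    # 7 subjects
```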

The Node Classification Problem
[Figure: feature vectors for Paper 1 and Paper 2 being mapped to subjects such as Computer Science and Mathematics]

Supervised Machine Learning Approach

Feature vector representation → supervised ML model (e.g. a neural network) → paper subject

            “equation”  “turing”  “graph”
Paper 1          1          0         0
Paper 2          0          1         1

But what about all the relationship information?

Degree could be interesting?
Maybe interesting, but can we do better?

Feature vector representation (now including node degree) → supervised ML model (e.g. a neural network) → paper subject

            “equation”  “turing”  “graph”   Node degree
Paper 1          1          0         0          82
Paper 2          0          1         1           3
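
As a purely illustrative sketch of this idea (scikit-learn is not part of the talk, and the arrays below are random placeholders standing in for the real Cora features, degrees, and labels), one could simply append the node degree as an extra column and train an ordinary classifier:

```python
# Illustrative baseline: keyword features + node degree -> subject, with a plain classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
features = rng.integers(0, 2, size=(2708, 1433))  # placeholder for the binary keyword vectors
degrees = rng.integers(1, 100, size=(2708, 1))    # placeholder for node degrees
subjects = rng.integers(0, 7, size=2708)          # placeholder for subject labels

X = np.hstack([features, degrees])                # degree becomes one extra feature dimension
X_train, X_test, y_train, y_test = train_test_split(X, subjects, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```

This treats the graph as nothing more than a per-node statistic, which is exactly the limitation the next slides address.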

Using Graph Neural Networks
Graph topology-aware deep machine learning

A Very Brief Intro to GNNs
A neural network architecture based on message passing, where:
●Inputs are node feature vectors (same as regular ML)
●Layer connectivity is defined by relationships
Example:
Each target node gets its own computation graph, but node vectors and weights are shared
[Figure: the GNN computation graph for node A. Layer 0: node feature vectors; Layer 1: outputs transformed vectors; Layer 2: the output vector for A]

The GNN layer
The GNN layer abstractly consists of two parts:
1.Aggregating neighbor output vectors from previous layer
2.Updating target node vector with neural network
Graph Convolutional Network (GCN) example:
x_B^{(i)} = σ( W^{(i)} · aggregate( x_A^{(i-1)}, x_C^{(i-1)} ) )

where σ is a non-linear activation, W^{(i)} is the layer-i weight matrix, and A and C are the neighbors of target node B; in a GCN the aggregation is a (degree-normalized) mean of the neighbors' previous-layer vectors.
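
As a toy NumPy sketch of this two-step aggregate-and-update (all names and numbers are illustrative; a real GCN layer also adds self-loops and degree normalization):

```python
# Toy mean-aggregation GNN layer; illustrative only, not the GDS or PyG implementation.
import numpy as np

def gnn_layer(x_prev, neighbors, W):
    """x_prev: (num_nodes, d_in) node vectors from the previous layer,
    neighbors: dict node id -> list of neighbor ids,
    W: (d_in, d_out) weight matrix shared by all nodes in this layer."""
    x_next = np.zeros((x_prev.shape[0], W.shape[1]))
    for v, nbrs in neighbors.items():
        agg = x_prev[nbrs].mean(axis=0)        # 1. aggregate neighbors' previous-layer vectors
        x_next[v] = np.maximum(agg @ W, 0.0)   # 2. update: shared linear transform + ReLU as sigma
    return x_next

# Tiny example: node B (=1) with neighbors A (=0) and C (=2)
x0 = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
adj = {0: [1], 1: [0, 2], 2: [1]}
W1 = np.array([[0.5, -0.2], [0.3, 0.8]])
print(gnn_layer(x0, adj, W1))
```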

PyTorch and PyG
●An optimized tensor library for deep learning using
GPUs and CPUs
●Originally developed by Meta AI, now part of the Linux
Foundation umbrella
●Rich ecosystem
●Interfaces in both Python and C++
●PyTorch Geometric (PyG):
○Extension for graph learning
○Supports lots of GNN architectures
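
As a hedged sketch of what a small PyG model for the Cora node classification problem might look like (layer sizes are illustrative and not taken from the talk's demo notebook):

```python
# Minimal two-layer GCN sketch with PyTorch Geometric.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, num_features, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(num_features, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))  # first message-passing layer + non-linearity
        return self.conv2(h, edge_index)       # second layer outputs one logit per subject

# For Cora: 1433 binary keyword features, 7 subject classes
model = GCN(num_features=1433, hidden_dim=16, num_classes=7)
```

Training then follows the usual PyTorch pattern: compute logits for all nodes, take cross-entropy over the labeled nodes, and backpropagate.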

GNN Pros
•Leverage topology through message passing
•Can capture lots of information in weights
•Are inductive:
◦Can be trained on one graph and applied for prediction on another (similar) graph

GNN Cons
•Requires a lot of memory
•Are slow to compute
◦Even with GPUs
•Hard to interpret
•Fairly complicated to implement

Let’s train a GNN on a graph subsample!

Graph Sampling
Random walk with restarts and Neo4j Graph Data Science

Random Walk with Restarts (RWR) sampling
The algorithm:
Take random walks in the graph, but keep restarting from the same root node intermittently
When enough of the graph has been visited, output the visited subgraph
Shown by Leskovec et al. [*] to produce structurally
representative subgraphs
[*]: Leskovec et al., "Sampling from Large Graphs", Proceedings of the 12th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2006
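
As a toy, illustrative sketch of the idea (a single root node, no weighting, and a hard step cap; this is not the GDS implementation):

```python
# Toy random walk with restarts (RWR) sampler over an adjacency dict.
import random

def rwr_sample(adjacency, root, target_size, restart_prob=0.15, seed=42):
    """adjacency: dict node -> list of neighbors; returns a set of sampled node ids."""
    random.seed(seed)
    visited = {root}
    current = root
    for _ in range(100 * target_size):  # step cap so the sketch always terminates
        if len(visited) >= target_size:
            break
        if random.random() < restart_prob or not adjacency[current]:
            current = root  # restart the walk from the root node
        else:
            current = random.choice(adjacency[current])  # step to a random neighbor
        visited.add(current)
    return visited

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1, 4], 4: [3]}
print(rwr_sample(graph, root=0, target_size=4))  # e.g. a 4-node sample around node 0
```

The sampled node set then induces the output subgraph (keep every relationship whose endpoints were both visited).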

Neo4j Graph Data Science (GDS) Library
●A Neo4j DB plugin for analytics
○Part of the DBMS (server) process
○Projects database into memory for analysis
●Provides high performance graph algorithms
○Running at scale (100s of billions of nodes)
○Has a nice implementation of RWR sampling
●Supports rapid transfer to and from the library
○Using the Apache Arrow memory format and
libraries
[Diagram: the client side runs a Neo4j driver and an Arrow client, which talk over Bolt and Arrow to the server-side JVM process running Neo4j and GDS]

But there is an easier way…

Neo4j Graph Data Science Client
●A Pythonic surface for GDS
●Wraps Neo4j Python driver
●Looks very similar to the GDS Cypher API
●Adds additional convenience functionality
●Uses Arrow client for fast data transfer
seamlessly under the hood
●pip install graphdatascience
[Diagram: the GDS Client on the client side talks over Bolt and Arrow to the server-side JVM process running Neo4j GDS]
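
A hedged end-to-end sketch of how this workflow might look with the client (connection URI, credentials, and graph names are placeholders; the exact procedure namespace depends on the GDS version, e.g. the sampler lived under gds.alpha in early 2.x releases):

```python
# Sketch: project the citation graph, sample it with RWR, and inspect the result.
from graphdatascience import GraphDataScience

gds = GraphDataScience("bolt://localhost:7687", auth=("neo4j", "password"))

# Project (:Paper)-[:CITES]->(:Paper) into the in-memory graph catalog on the server
G, _ = gds.graph.project("papers", "Paper", "CITES")

# Random walk with restarts sampling: keep roughly 10% of the nodes
G_sample, _ = gds.graph.sample.rwr("papers_sample", G, samplingRatio=0.1)

print(G.node_count(), "->", G_sample.node_count())

# Node properties (features, subjects) can then be streamed back over Arrow,
# e.g. via gds.graph.nodeProperties.stream(...), and fed into PyG for GNN training.
```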

Demo
Jupyter notebook time!

Summary and Further Learning
What did we learn? And what’s next?

What did we learn?
●Node classification is an interesting graph ML problem
●It’s useful to leverage topology when learning on graphs
●GNNs can do this well, but are slow
●Since GNNs are inductive we can train them on a subsample
●We can use PyG for GNN training
●We can subsample using RWR
●We can use GDS (including its Python client) for RWR
●GDS’s rapid transfer capabilities enable a fast workflow

Also at NODES
If you liked this presentation, you might also like
•Fundamentals of Neo4j Graph Data Science Series 2.x – Pipelines and more
◦Mats Rydberg
◦Thursday, November 17, 9:40 – 10:25 CET
•Link Prediction With Graph Data Science at Scale
◦Florentin Dörre
◦Thursday, November 17, 13:40 – 13:55 CET

A Bunch of Links
●The DEMO notebook
●Neo4j Graph Data Science Manual
●Neo4j Graph Data Science Client Manual
●PyTorch Docs
●PyTorch Geometric Docs
●The Cora dataset
●GCN paper
●Sampling from Large Graphs paper
●Graph Representation Learning book
●Apache Arrow

Thank you!
Contact me at
[email protected]
adamnsch@Github