# Numpy Point Cloud

With Python bindings such as pclpy and python-pcl, you can use the power and performance of PCL (the Point Cloud Library) from the comfort of Python. A point cloud is most naturally represented as a NumPy array: an (m x n) array holding n coordinates for each of m points, built with the numpy.array function. This representation gives direct memory access to the data fields of each point, such as x, y, z, and intensity. Tools built on top of it include pptk, a 3-D point cloud viewer that accepts any 3-column NumPy array as input and renders tens of millions of points interactively using an octree-based level-of-detail mechanism, and PyMesh, a geometry-processing library for Python. Such tools are useful, for example, for extracting ground points from unclassified point clouds, or for detailed vegetation-height measurements in forestry and rangeland ecology. Octrees are also a useful alternative data structure for tasks such as downsampling. (pclpy currently installs via pip on Windows x64 with Python 3.6; Linux compatibility is in progress.)
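As a minimal sketch of the array representation described above (the column layout x, y, z, intensity is illustrative, not a fixed standard), accessing a per-point field is just column slicing:

```python
import numpy as np

# Hypothetical point cloud stored as an (N, 4) array of x, y, z, intensity.
rng = np.random.default_rng(0)
cloud = rng.random((1000, 4))

xyz = cloud[:, :3]        # the 3-D coordinates of every point
intensity = cloud[:, 3]   # the per-point intensity field

print(xyz.shape, intensity.shape)  # (1000, 3) (1000,)
```

The same slicing works regardless of how many attribute columns follow the coordinates.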
NumPy arrays also underpin the surrounding tooling. A particular focus of the rowan package is working with unit quaternions, which are a popular means of representing rotations. numpy.meshgrid(*xi, **kwargs) returns coordinate matrices from coordinate vectors, which is handy for building regular grids of sample points. For registration, the Coherent Point Drift (CPD) algorithm of Myronenko and Song formulates the problem as probability density estimation: one point cloud is represented by a Gaussian Mixture Model (GMM) and the other point cloud is treated as observations drawn from that GMM. For surface fitting (for example a least-squares sphere fit), keep in mind that results are better with substantially more than the minimum six data points. In ROS, point clouds arrive as sensor_msgs PointCloud2 messages; offline, you can also build clouds from saved Kinect V2 RGB and depth images. Whatever the source, the input parameter to an algorithm such as PCA's constructor is simply a NumPy array.
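The meshgrid call mentioned above is easiest to see with tiny coordinate vectors; this is a minimal sketch of turning two axes into a flat list of grid points:

```python
import numpy as np

# np.meshgrid turns coordinate vectors into coordinate matrices.
x = np.linspace(0.0, 1.0, 3)   # 3 samples along x
y = np.linspace(0.0, 1.0, 2)   # 2 samples along y
X, Y = np.meshgrid(x, y)       # each has shape (2, 3)

# Flatten the matrices into an (n_points, 2) array of grid coordinates.
grid_points = np.column_stack([X.ravel(), Y.ravel()])
print(X.shape, grid_points.shape)  # (2, 3) (6, 2)
```

Stacking a third axis the same way yields a regular 3-D lattice of sample points.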
To build a renderable point cloud, simply provide a list of points and, optionally, per-point colors and normals. PCL itself is a heavily templated C++ API, and consequently mapping it into Python using Cython is challenging. PyVista sidesteps this: a point cloud is a pyvista.PolyData object and can easily have scalar or vector data arrays associated with its points. Open3D's draw_geometries function visualizes a point cloud, and pptk handles very large clouds. Rendering a 3-D cloud on screen ultimately means "projecting" each 3-D point onto a 2-D drawable surface (a canvas) using some simple geometry rules.
If a cloud of points lies on a regular grid, we simply need to recover the dimensions of that grid and we can then generate a PyVista structured mesh from it; this is no different from creating any PyVista mesh from your own NumPy arrays. Depth maps can likewise be converted into editable 3-D point clouds. Note that matplotlib's mplot3d objects are point based, whereas STL files use triangles instead, so exporting a cloud to STL requires a surface-reconstruction step. For lidar specifically, there are Python module sets (built on SPDLib and RIOS, and using NumPy throughout) that make it easy to write lidar processing code, and pipelines commonly convert lidar point clouds to raster formats. Beyond visualization, a feature descriptor characterizes a specific point relative to the rest of the points in the point cloud. One practical caveat: the pptk viewer may fail to start for larger inputs (the actual limit depends on system and GPU memory; on certain machines this happens for inputs larger than roughly 100M points). This post was inspired by an answer by user D. Duff on StackOverflow.
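Recovering the grid dimensions mentioned above can be as simple as counting the distinct coordinate values along each axis; this sketch assumes the cloud really was sampled on an axis-aligned grid:

```python
import numpy as np

# Build a gridded point cloud, then pretend we only have the flat points.
x = np.linspace(0.0, 1.0, 20)
y = np.linspace(0.0, 1.0, 30)
X, Y = np.meshgrid(x, y)
pts = np.column_stack([X.ravel(), Y.ravel()])

# Recover the grid dimensions by counting unique coordinates per axis.
nx = np.unique(pts[:, 0]).size
ny = np.unique(pts[:, 1]).size
print(nx, ny)  # 20 30
```

With real (noisy) data you would round or bin the coordinates before calling `np.unique`.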
To implement an algorithm, we will start by defining a dataset to work with. The point cloud data should be represented as a NumPy array with N rows and at least 3 columns (x, y, z, plus optional attributes); a library such as pyntcloud internally represents this as a pandas DataFrame. Subtracting the centroid centers the point cloud about the origin. A common downsampling task: given a cloud of (x, y, z) points, split the data into 5x5 X-Y bins and calculate the mean height value (z coordinate) in each bin. For geometric structure, Qhull implements the Quickhull algorithm for computing the convex hull, and with C++ and PCL you can find the minimum oriented bounding box of a point cloud. When discussing a rotation, there are two possible conventions: rotation of the axes, and rotation of the object relative to fixed axes. Point clouds also feed simulation workflows, for example generating an OpenFOAM mesh from (x, y, z) points defining a bed elevation extracted from a flume study. And construction methods differ: ORB-SLAM2 can construct a sparse map at high speed, while OpenSfM generates a relatively high-quality dense point cloud.
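The 5x5 mean-height binning described above can be sketched in pure NumPy with `np.add.at` (the bin count and the unit square extent are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
pts = rng.random((10_000, 3))  # synthetic (x, y, z) cloud in the unit cube

nbins = 5
# Map each point's x and y into a bin index in [0, nbins).
ix = np.clip((pts[:, 0] * nbins).astype(int), 0, nbins - 1)
iy = np.clip((pts[:, 1] * nbins).astype(int), 0, nbins - 1)

# Accumulate z sums and counts per cell, then divide for the mean height.
sums = np.zeros((nbins, nbins))
counts = np.zeros((nbins, nbins))
np.add.at(sums, (ix, iy), pts[:, 2])
np.add.at(counts, (ix, iy), 1)
mean_height = sums / np.maximum(counts, 1)
print(mean_height.shape)  # (5, 5)
```

`scipy.stats.binned_statistic_2d` does the same job in one call if SciPy is available.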
Qhull computes the convex hull, Delaunay triangulation, Voronoi diagram, halfspace intersection about a point, furthest-site Delaunay triangulation, and furthest-site Voronoi diagram. For distances, scipy.spatial.distance.pdist(X, metric='euclidean') computes pairwise distances between observations in n-dimensional space, and a k-d tree provides an index into a set of k-dimensional points for rapidly looking up the nearest neighbors of any point. numpy.ptp(a, axis=None, out=None) returns the range (peak-to-peak) of the values along the given axis, or of the flattened array by default. Depth images are arrays too: a 640x480 image is a NumPy array of bytes. A common GIS task is matching the coordinates in a shapefile to the coordinates in a point cloud file and extracting the point cloud classification data. If you need OpenCV, you can install it with pip install opencv-python. In any viewer, use the mouse or trackpad to see the geometry from different viewpoints.
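Centering a cloud about the origin and measuring its extent with `np.ptp`, both mentioned above, fit in a few lines; the offset of 10 is just to show that centering works:

```python
import numpy as np

rng = np.random.default_rng(2)
cloud = rng.random((500, 3)) + 10.0   # a cloud far from the origin

# Center the point cloud about the origin by subtracting the centroid.
centered = cloud - cloud.mean(axis=0)

# np.ptp gives the peak-to-peak (max - min) extent along each axis.
extent = np.ptp(centered, axis=0)
print(np.allclose(centered.mean(axis=0), 0.0))  # True
```

The per-axis extent is a quick sanity check before choosing, say, a voxel size.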
NumPy makes per-point attribute work straightforward. A LAS lidar file can be loaded into a NumPy array of [x, y, z, intensity, classification] records; from there you can, for example, filter out the points whose classification is 1 or 2, or calculate the distance from every point to a specific query point. Point cloud data can be created using lasers, radar waves, acoustic soundings, or other waveform-generation devices, and readers typically try to decode a file based on its extension name. In a typical exchange format, the first three columns are the x, y, z coordinates; trailing columns hold extra attributes and can often be ignored. Tools built on this foundation include pyntcloud, a Python 3 library for working with 3-D point clouds that leverages the power of the Python scientific stack, and VeloView, whose Python console gives users access to point cloud data and attribute arrays so NumPy can be used for advanced data analysis. Common operations include downsampling the point cloud with a voxel grid filter and computing the mean and covariance matrix of the points.
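Voxel downsampling, mentioned above, can be sketched with nothing but NumPy (real libraries such as Open3D average the points inside each voxel; this simplified version keeps one representative point per voxel, and the voxel size of 0.25 is illustrative):

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one point per occupied voxel (a simplified sketch)."""
    # Integer voxel index of every point.
    idx = np.floor(points / voxel_size).astype(np.int64)
    # First point encountered in each distinct voxel.
    _, keep = np.unique(idx, axis=0, return_index=True)
    return points[np.sort(keep)]

rng = np.random.default_rng(3)
pts = rng.random((2000, 3))          # dense cloud in the unit cube
down = voxel_downsample(pts, 0.25)   # at most 4*4*4 = 64 voxels occupied
print(down.shape)
```

Averaging instead of picking the first point gives a smoother result at slightly more cost.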
Goal: freely manipulate data captured with a depth sensor using Open3D (see "Open3D: A Modern Library for 3D Data Processing"). For fitting, numpy.linalg.lstsq(a, b) returns the least-squares solution to a linear matrix equation, computing a vector x that minimizes the squared Euclidean 2-norm, and scipy's interpolation routines handle line-structured point clouds. A caution on orientation estimates: getting the rotation of a noisy rectangular 2-D point cloud with PCA does not quite work, because the principal components of a (mostly) rectangular cloud point toward its corners, not its sides. Point clouds can also be generated by sampling points on a 3-D mesh. For real data, the KITTI dataset is convenient: apart from the familiar toolset of numpy and matplotlib, the pykitti package loads its point clouds directly. And if the GUI does not offer enough flexibility, you may always write your own Python code.
A point cloud is a set of points in space, represented by a matrix of size n x 3, where n is the number of points; the ndarray is NumPy's n-dimensional array object, storing a collection of elements of the same data type (dtype). As a small NumPy aside, the tilde operator (~) inverts an array of booleans. Fitting a 3-D plane to a point cloud can be done with SVD, and given suitable constraints, MLS (moving least squares) interpolation can construct an implicit surface function from the points. To convert a condensed pairwise distance vector into a square distance matrix, use scipy.spatial.distance.squareform. If you wanted to rotate a point around something other than the origin, you need to first translate the whole system so that the point of rotation is at the origin, rotate, then translate back. For heavier visualization there is always VTK, the Visualization Toolkit, which has extensive examples to start with.
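The SVD plane fit mentioned above is short enough to show in full; this is a sketch on synthetic data (the plane coefficients 0.5 and -0.25 and the noise level are illustrative):

```python
import numpy as np

# Synthetic noisy points near the plane z = 0.5*x - 0.25*y.
rng = np.random.default_rng(4)
xy = rng.random((200, 2))
z = 0.5 * xy[:, 0] - 0.25 * xy[:, 1] + 0.01 * rng.standard_normal(200)
pts = np.column_stack([xy, z])

# Center the cloud, then take the right singular vector with the
# smallest singular value: that direction is the plane normal.
centered = pts - pts.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
normal = vt[-1]
print(np.isclose(np.linalg.norm(normal), 1.0))  # True
```

The fitted normal should align (up to sign) with the true normal (0.5, -0.25, -1) normalized.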
You can also go the other way: grab a depth frame off a device and create a corresponding Blender mesh, or generate a 3-D structured-points dataset directly from NumPy arrays. python-pcl takes the direct route to PCL itself: it is written in Cython and implements enough of the hard bits of the API to be useful. In display pipelines, NumPy manipulates the data shown in a 2-D window while OpenCV displays the manipulated depth data. For spatial queries, k-d trees, a special case of binary space partitioning trees, index the points for fast nearest-neighbor search. On the geometry side, a triangulation of a compact surface is a finite collection of triangles that cover the surface in such a way that every point on the surface is in a triangle, and the intersection of any two triangles is either empty, a common edge, or a common vertex.
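The k-d tree lookup described above is one line with SciPy's `cKDTree` (assuming SciPy is installed; pykdtree or FLANN are faster drop-in alternatives for big clouds):

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
cloud = rng.random((5000, 3))

# Build the index once, then query it many times.
tree = cKDTree(cloud)
dist, idx = tree.query(cloud[:10], k=3)  # 3 nearest neighbors of 10 points
print(dist.shape, idx.shape)  # (10, 3) (10, 3)
```

Since we queried with points from the cloud itself, each point's nearest neighbor is itself at distance zero.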
Classification at scale is a concern: the traditional way to classify a point cloud is to render the whole dataset as a 3-D object, which costs a lot, and with 20+ million (lat, lon, altitude) points you also want to interpolate to arbitrary positions efficiently. Examples of such point clouds include data coming from stereo cameras or time-of-flight cameras, and formats such as BPF (encoded as version 1, 2, or 3) can be read directly. CloudCompare ships plugins for these tasks, e.g. qSRA (comparison between a point cloud and a surface of revolution), qCANUPO (point cloud classification), and qM3C2 (computation of robust signed distances between point clouds). When writing results to a VTK file, the data you want to write are supposed to arise from scipy/numpy arrays. For learning-based classification, the support vector machine is very powerful and widely used in both industry and academia: separating two point clouds is easy with a linear boundary, and when they cannot be separated linearly, a kernel (a function that a domain expert provides to the learning algorithm, and not limited to SVMs) does the job.
Deep learning has its own line of attack: PointNet (C. R. Qi, H. Su, K. Mo, and L. J. Guibas, "PointNet: Deep Learning on Point Sets for 3D Classification and Segmentation," CoRR) consumes raw point sets directly. For inspection, the Point Processing Tool Kit (PPTK) was developed by HERE to help Python developers visualize 3-D point cloud data captured by sources such as lidar; its viewer directly takes a 3-column NumPy array as input and is able to interactively visualize 10-100 million points. If you instead need to embed geometry in a document, the Adobe 3D PDF documentation indicates that only point, line, and mesh geometries are supported. A recurring theme throughout: iterating over NumPy arrays is non-idiomatic and quite slow, so prefer vectorized operations, whether you are calculating the unique connecting vectors of a point cloud or registering 2-D and 3-D point sets, a common problem in computer vision. A concrete question along these lines: how can we compute a normal vector from a cloud of points that theoretically lie on a same surface, such as points from a CT scan?
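One standard answer to the normal-vector question above is the covariance method: the eigenvector of the neighborhood's covariance matrix with the smallest eigenvalue points along the surface normal. A minimal sketch on points lying exactly in the z = 0 plane:

```python
import numpy as np

# 100 points sampled on the z = 0 plane (an idealized local neighborhood).
rng = np.random.default_rng(6)
xy = rng.random((100, 2))
pts = np.column_stack([xy, np.zeros(100)])

# Covariance of the coordinates; eigh returns eigenvalues in
# ascending order, so column 0 is the smallest-eigenvalue direction.
cov = np.cov(pts.T)
eigvals, eigvecs = np.linalg.eigh(cov)
normal = eigvecs[:, 0]
print(np.abs(normal))  # ≈ [0, 0, 1]
```

With real data you would run this per point over its k nearest neighbors, then orient the normals consistently (e.g. toward the sensor).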
The above procedure can be repeated for other point clouds in Semantic3D. Photogrammetry suites expose the same data: in Pix4D, enabling the "point cloud" layer in rayCloud after processing displays the densified point cloud, and the point size can be adjusted with the slider in the layer's properties. Historically, Travis Oliphant created the NumPy package in 2005 by unifying the ancestor module Numeric with the features of Numarray, and the whole point cloud ecosystem above rests on it. Plotly's Scatter3d provides a basic 3-D scatter plot when you need one in the browser, and a processing script can simply save the point cloud as a PLY file for the next step. For registration, point-to-plane ICP is faster than point-to-point. Finally, a rotation matrix that creates a counterclockwise rotation of angle theta about the origin in the 2-D plane is easy to build with NumPy.
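The 2-D rotation matrix, combined with the translate-rotate-translate recipe for rotating about an arbitrary point, can be sketched as follows (`rotate_about` is a hypothetical helper name):

```python
import numpy as np

def rotate_about(points, theta, center):
    """Rotate 2-D points counterclockwise by theta (radians) about
    an arbitrary center: translate to the origin, rotate, translate back."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])
    return (points - center) @ R.T + center

pts = np.array([[2.0, 1.0]])
out = rotate_about(pts, np.pi / 2, center=np.array([1.0, 1.0]))
print(np.round(out, 6))  # [[1. 2.]]
```

Passing `center=np.zeros(2)` reduces this to the plain rotation about the origin.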
Pitfalls remain. Displaying an image alongside its point cloud can go wrong: loading the image into a NumPy array of uint8 values and sending it to a simulator such as V-REP may produce garbage if the channel layout or dtype does not match what the receiver expects. Statistical summaries have pitfalls too: the minimum-volume ellipsoid containing a specified fraction of the points is usually not concentric with the point cloud and generally will not share its principal axes. For experiments, a synthetic cloud can be constructed by stacking shifted random numbers. For nearest-neighbor-heavy workloads, FLANN (the Fast Library for Approximate Nearest Neighbors) is a reportedly very fast library for comparing multi-dimensional points. And the classic end-to-end exercise: produce a 3-D point cloud from a depth image and some camera intrinsics.
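The depth-image-to-cloud exercise above is a pinhole back-projection; this sketch uses illustrative intrinsics (`fx`, `fy`, `cx`, `cy` are typical Kinect-like values, not from any specific device) and a flat synthetic depth map:

```python
import numpy as np

def depth_to_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into an (N, 3) point cloud
    using pinhole intrinsics: x = (u - cx) * z / fx, y = (v - cy) * z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

depth = np.ones((480, 640), dtype=np.float32)  # a flat "wall" 1 m away
cloud = depth_to_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(cloud.shape)  # (307200, 3)
```

Pixels with invalid depth (zeros or NaNs) should be masked out before stacking in real data.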
In practice, environment setup matters: a custom Python interpreter with the applicable packages installed works fine on Windows 10, even inside hosts such as FME Workbench 2019. Bindings such as pclpy wrap PCL's PCLPointCloud2 class into Python so users can pass data from NumPy to a PointCloud easily, after which loading a PLY point cloud from disk is a one-liner. Image-derived clouds start from pixel indices, for example taking the indexes of the white pixels to find the axes of a blob. Two analysis tasks round things out: given a point cloud C where each point has an associated value, find the subset of points that are local maxima; and, more generally, explore PCL itself. PCL is an excellent open-source point cloud library that integrates many efficient algorithms proposed by leading researchers, is well worth studying in depth, and can be configured on, for example, Windows 10 with Visual Studio 2017.