
How to transform two sets of discrete points (vectors) to plot them on a common scale

I feel you need to upsample / interpolate the vector with fewer samples and downsample / decimate the vector with more samples, in essence matching the sampling rates of the two vectors.

I used scipy.signal.resample to do the up/down-sampling.

I tried to simulate your situation using two random vectors of unequal sample sizes.

See if this helps you out :

import numpy as np
from scipy import signal
# scipy.signal provides resample() for interpolation / decimation
import matplotlib.pyplot as plt

# Creating vectors a and b with unequal sample counts

vector_a = np.sin(2*3.14*100*np.arange(130))
# Sine signal with 100 Hz frequency and 130 time samples
# (3.14 rather than np.pi, so the integer-sample values are not all ~0)

vector_b = np.cos(2*3.14*100*np.arange(80))
# Cosine signal with 100 Hz frequency and 80 time samples

# To avoid bias towards either vector, take the mean of the two
# sample lengths as the common sample length

common_no_of_samples = (vector_a.shape[0] + vector_b.shape[0]) // 2
# 105 samples

# Downsample vector_a (130 -> 105 samples)
vector_a = signal.resample(vector_a, common_no_of_samples)
# Upsample vector_b (80 -> 105 samples)
vector_b = signal.resample(vector_b, common_no_of_samples)

fig, ax = plt.subplots()
ax.plot(np.arange(common_no_of_samples), vector_a, label='vector_a')
ax.plot(np.arange(common_no_of_samples), vector_b, label='vector_b')
ax.legend()
plt.show()

# np.arange(common_no_of_samples) is the common time axis;
# vector_a and vector_b are the resampled vectors.
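Note that signal.resample is FFT-based and implicitly treats the signal as periodic, which can cause ringing near the edges of non-periodic data. As a sketch of an alternative (variable names mirror the code above, but this substitutes plain linear interpolation via np.interp for signal.resample):

```python
import numpy as np

# Same example vectors as above, with unequal sample counts.
vector_a = np.sin(2*3.14*100*np.arange(130))
vector_b = np.cos(2*3.14*100*np.arange(80))

common_no_of_samples = (vector_a.shape[0] + vector_b.shape[0]) // 2  # 105

# Map each vector onto a common normalised axis [0, 1] and
# linearly interpolate it at the common sample positions.
common_axis = np.linspace(0, 1, common_no_of_samples)
axis_a = np.linspace(0, 1, vector_a.shape[0])
axis_b = np.linspace(0, 1, vector_b.shape[0])

vector_a_resampled = np.interp(common_axis, axis_a, vector_a)
vector_b_resampled = np.interp(common_axis, axis_b, vector_b)
```

Linear interpolation is less accurate for band-limited signals than FFT resampling, but it stays local and preserves the endpoint values exactly.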


If you want the result as (value, time) points, you could do:

time_axis = np.arange(common_no_of_samples)
vector_a = np.dstack((vector_a, time_axis))

This will generate points of the form :

array([[[  2.23656191e-02,   0.00000000e+00],
        [ -3.96584073e-01,   1.00000000e+00],
        [ -7.01262520e-01,   2.00000000e+00],
        [ -9.31867589e-01,   3.00000000e+00],
        [ -9.95165113e-01,   4.00000000e+00],
        [ -9.24625413e-01,   5.00000000e+00],
        [ -6.96587056e-01,   6.00000000e+00],
        [ -3.74795767e-01,   7.00000000e+00],
        [  1.59956385e-02,   8.00000000e+00],
        [  3.94192306e-01,   9.00000000e+00],
        [  7.20969109e-01,   1.00000000e+01],
        [  9.28803144e-01,   1.10000000e+01],
        [  1.00160878e+00,   1.20000000e+01],
        [  9.13659002e-01,   1.30000000e+01],
        [  6.91934367e-01,   1.40000000e+01],
        [  3.57910455e-01,   1.50000000e+01],
        ...
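np.dstack gives a 3-D array of shape (1, 105, 2). If you would rather have a plain 2-D array with one (value, time) row per sample, np.column_stack is a simpler alternative (a sketch; the values here stand in for the resampled vector above):

```python
import numpy as np

# Stand-ins for the resampled vector and its common time axis.
common_no_of_samples = 105
vector_a = np.sin(np.linspace(0, 6.28, common_no_of_samples))
time_axis = np.arange(common_no_of_samples)

# column_stack pairs the two 1-D arrays column-wise,
# producing shape (105, 2) instead of dstack's (1, 105, 2).
points_a = np.column_stack((vector_a, time_axis))
```

Each row of points_a is then directly usable as a (value, time) point.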

Categories : Algorithm
