Questions tagged [inference]

Inference is the act or process of deriving logical conclusions from premises known or assumed to be true. The conclusion drawn is also called an inference. The laws of valid inference are studied in the field of logic.

Inference can also be defined in another way: as the non-logical, but rational, means of indirectly seeing new meanings and contexts for understanding through the observation of patterns of facts. Of particular use to this application of inference are anomalies and symbols. Inference, in this sense, does not draw conclusions but opens new paths for inquiry. Under this definition there are two types of inference: inductive and deductive. Unlike the definition in the first paragraph above, word meanings are not tested here; rather, meaningful relationships are articulated.

Human inference (i.e. how humans draw conclusions) is traditionally studied within the field of cognitive psychology; researchers develop automated inference systems to emulate human inference. Statistical inference allows for inference from quantitative data.

Source: http://en.wikipedia.org/wiki/Inference

613 questions
5 votes • 1 answer

TFLite Inference on video input

I have an SSD tflite detection model that I am running with Python on a desktop computer. For now, my script below takes a single image as input for inference, and it works fine: # Load TFLite model and allocate tensors. interpreter =…
Sayaki
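A minimal sketch of the frame-by-frame video loop this question is after, assuming a uint8 SSD detector and hypothetical file names (detect.tflite, input.mp4); the interpreter calls are the standard tf.lite API:

```python
import cv2
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="detect.tflite")  # hypothetical path
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
_, h, w, _ = inp["shape"]

cap = cv2.VideoCapture("input.mp4")  # hypothetical video source; 0 for a webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # resize the frame to the model's input size and add a batch axis
    x = cv2.resize(frame, (int(w), int(h)))[None].astype(np.uint8)
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()                      # one inference per frame
    boxes = interpreter.get_tensor(out["index"])
cap.release()
```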
5 votes • 0 answers

Inference time and TFLOPS in PyTorch

I am currently looking into the half-precision inference time of different CNN models with torch.autograd.profiler on two different GPUs: Nvidia RTX 2080 Ti (26.90 TFLOPS) - done locally (better CPU) Nvidia T4 (65.13 TFLOPS) - done in the…
Tuxa
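For reference, a sketch of how such a measurement is usually set up with the legacy torch.autograd.profiler API; the model choice (torchvision's resnet50) and batch size are assumptions, and a few warm-up runs are needed so one-time CUDA setup does not skew the numbers:

```python
import torch
from torchvision import models

model = models.resnet50().half().cuda().eval()  # assumed example network
x = torch.randn(8, 3, 224, 224, dtype=torch.float16, device="cuda")

with torch.no_grad():
    for _ in range(5):          # warm-up: exclude lazy CUDA initialization
        model(x)

with torch.no_grad(), torch.autograd.profiler.profile(use_cuda=True) as prof:
    model(x)
print(prof.key_averages().table(sort_by="cuda_time_total", row_limit=10))
```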
5 votes • 2 answers

TypeScript infer function parameters in derived class

I noticed that when implementing a generic interface (or class) and explicitly stating the types of those generics, the parameter types for functions inside the subclass are not inferred. interface MyInterface<T> { open(data: T): void } class…
Viktor W
5 votes • 1 answer

Variational Autoencoder on Timeseries with LSTM in Keras

I am working on a Variational Autoencoder (VAE) to detect anomalies in time series. So far I have worked with this tutorial https://blog.keras.io/building-autoencoders-in-keras.html and this…
Aaron2Dev
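A compact LSTM-VAE sketch in the spirit of that tutorial, under the assumption of univariate windows of 30 timesteps; the layer sizes, latent dimension, and stand-in data are all placeholders:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

timesteps, features, latent_dim = 30, 1, 2

# encoder: an LSTM summarizes the window, then mean/log-variance heads
enc_in = keras.Input(shape=(timesteps, features))
h = layers.LSTM(32)(enc_in)
z_mean = layers.Dense(latent_dim)(h)
z_log_var = layers.Dense(latent_dim)(h)

def sample(args):
    mean, log_var = args
    eps = tf.random.normal(tf.shape(mean))   # reparameterization trick
    return mean + tf.exp(0.5 * log_var) * eps

z = layers.Lambda(sample)([z_mean, z_log_var])

# decoder: repeat z across time and reconstruct the sequence
h_dec = layers.RepeatVector(timesteps)(z)
h_dec = layers.LSTM(32, return_sequences=True)(h_dec)
dec_out = layers.TimeDistributed(layers.Dense(features))(h_dec)

vae = keras.Model(enc_in, dec_out)

# reconstruction + KL terms added as a model-level loss
recon = tf.reduce_mean(tf.reduce_sum(tf.square(enc_in - dec_out), axis=[1, 2]))
kl = -0.5 * tf.reduce_mean(
    tf.reduce_sum(1 + z_log_var - tf.square(z_mean) - tf.exp(z_log_var), axis=1))
vae.add_loss(recon + kl)
vae.compile(optimizer="adam")

x = np.random.rand(256, timesteps, features).astype("float32")  # stand-in data
vae.fit(x, epochs=2, batch_size=32)
```

At detection time, windows with high reconstruction error are flagged as anomalies.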
5 votes • 1 answer

TensorFlow C++ batch inference

I have a problem with making inference on a batch size greater than 1 using the C++ TensorFlow API. The network input planes are 8x8x13 and the output is a single float. When I try to infer on multiple samples as follows, the result is correct only…
danny
5 votes • 1 answer

How to batch multiple video frames before running a TensorFlow inference session

I made a project that basically uses Google's Object Detection API with TensorFlow. All I am doing is inference with a pre-trained model: which means real-time object detection where the input is the video stream of a webcam or something similar using…
gustavz
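A sketch of the batching idea, assuming the TF 1.x Object Detection API graph from the question with its conventional tensor names (image_tensor, detection_boxes, detection_scores): N collected frames are stacked into one (N, H, W, 3) array so a single session.run call scores them all instead of N separate calls:

```python
import numpy as np

def run_batched(sess, graph, frames):
    """Run one inference pass over a list of equally-sized RGB frames."""
    image_tensor = graph.get_tensor_by_name("image_tensor:0")
    boxes = graph.get_tensor_by_name("detection_boxes:0")
    scores = graph.get_tensor_by_name("detection_scores:0")
    batch = np.stack(frames, axis=0)  # (N, H, W, 3) instead of N x (1, H, W, 3)
    return sess.run([boxes, scores], feed_dict={image_tensor: batch})
```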
5 votes • 6 answers

Why doesn't C# do "simple" type inference on generics?

Just curious: sure, we all know that the general case of type inference for generics is undecidable. And so C# won't do any kind of sub-typing at all: if Foo<T> is generic, Foo<Int32> isn't a subtype of Foo<T>, or of Foo, or of anything else you…
Ken Birman
5 votes • 1 answer

PyMC observed data for a sum of random variables

I'm trying to infer a model's parameters with PyMC. In particular, the observed data is modeled as a sum of two different random variables: a negative binomial and a Poisson. In PyMC, an algebraic composition of random variables is described by a…
user2304916
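One way to express such a sum is to introduce the split between the two components as a latent variable; a sketch in PyMC3 syntax (the question predates PyMC3, and the toy data and priors here are assumptions):

```python
import numpy as np
import pymc3 as pm

# toy data: a Poisson plus a negative binomial component
y = np.random.poisson(3.0, 200) + np.random.negative_binomial(5, 0.5, 200)

with pm.Model():
    mu_nb = pm.Exponential("mu_nb", 0.1)
    alpha = pm.Exponential("alpha", 0.1)
    mu_p = pm.Exponential("mu_p", 0.1)

    # latent split: the part of each observed count carried by the NB component
    nb_part = pm.NegativeBinomial("nb_part", mu=mu_nb, alpha=alpha, shape=len(y))

    # the remainder must come from the Poisson component; its logp is -inf
    # for negative values, which automatically enforces nb_part <= y
    pm.Potential("poisson_part", pm.Poisson.dist(mu=mu_p).logp(y - nb_part).sum())

    trace = pm.sample(1000, tune=1000)
```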
5 votes • 3 answers

Is there any free OWL reasoner that can reason without loading all data into memory?

I use Jena and TDB to store RDF, and I want to do some inference on it. But the RDF data is big, and Jena's OWL reasoner has to load all the data into memory. So I want to find a reasoner that can reason without loading all data into memory; is there…
Wang Ruiqi
4 votes • 3 answers

A reverse inference engine (find a random X for which foo(X) is true)

I am aware that languages like Prolog allow you to write things like the following:

    mortal(X) :- man(X).    % All men are mortal
    man(socrates).          % Socrates is a man
    ?- mortal(socrates).    % Is Socrates mortal?
    yes

What I want is something…
Kef Schecter
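Outside of a real logic language, the naive fallback is generate-and-test; a Python sketch (the predicate and domain are made-up toys) that draws random candidates until one satisfies foo:

```python
import random

def find_random_satisfying(foo, domain, max_tries=10_000):
    """Draw random candidates from the domain until foo(x) holds."""
    for _ in range(max_tries):
        x = random.choice(domain)
        if foo(x):
            return x
    raise ValueError("no satisfying value found")

men = {"socrates", "plato"}          # toy knowledge base
mortal = lambda x: x in men          # all men are mortal
print(find_random_satisfying(mortal, ["socrates", "plato", "zeus"]))
```

Unlike Prolog's resolution, this does no constraint propagation; it only works when satisfying values are dense enough in the candidate domain.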
4 votes • 2 answers

Transformers: How to use CUDA for inference?

I have fine-tuned my models with a GPU, but the inference process is very slow; I think this is because inference uses the CPU by default. Here is my inference code: txt = "This was nice place" model =…
Mr. Engineer
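The usual fix is to move both the model and the tokenized inputs to the GPU explicitly; a sketch with an assumed checkpoint name (substitute the fine-tuned model path):

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "distilbert-base-uncased-finetuned-sst-2-english"  # assumed checkpoint
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name).to(device).eval()

inputs = tokenizer("This was nice place", return_tensors="pt").to(device)
with torch.no_grad():
    logits = model(**inputs).logits          # runs on the GPU if available
print(logits.argmax(dim=-1).item())
```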
4 votes • 2 answers

Search control in Haskell

Suppose you're writing a program that searches an exponentially large or infinite space: game playing, theorem proving, optimization, etc.; anything where you can't search the entire space, and the quality of results depends heavily on choosing which…
rwallace
4 votes • 1 answer

PyTorch: Inference on a single very large image using multiple GPUs?

I want to perform inference (i.e. semantic segmentation) on a very large satellite image without splitting it into pieces. I have access to 4 GPUs (each having 15 GB of memory) and was wondering if it is possible to somehow use all the memory of…
Federico
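One direction (not necessarily the accepted answer) is manual model parallelism: put the early layers on one GPU and the late layers on another, so the huge activation maps are split across memory pools instead of having to fit on a single device. A toy sketch assuming two GPUs:

```python
import torch
import torch.nn as nn

class TwoGPUSegmenter(nn.Module):
    """Toy segmentation net split across cuda:0 and cuda:1."""
    def __init__(self):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
        ).to("cuda:0")
        self.tail = nn.Sequential(
            nn.Conv2d(32, 1, 1),
        ).to("cuda:1")

    def forward(self, x):
        x = self.head(x.to("cuda:0"))
        return self.tail(x.to("cuda:1"))   # activations hop to the second GPU

model = TwoGPUSegmenter().eval()
with torch.no_grad():
    out = model(torch.randn(1, 3, 4096, 4096))  # stand-in for the satellite image
print(out.shape, out.device)
```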
4 votes • 1 answer

TensorFlow Lite inference - how do I scale down the convolution layer outputs?

I built a simple CNN model with one convolutional layer and converted it with TensorFlow Lite (for MNIST). So now my model gets 8-bit integer inputs and the weights are 8-bit integers too. I wanted to test the parameters I got from TFLite, so I wrote…
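The arithmetic behind that rescaling comes from the TFLite quantization scheme: real = scale * (q - zero_point), so the int32 convolution accumulator is brought back to 8 bits with the combined multiplier (input_scale * weight_scale) / output_scale. A plain-NumPy sketch (the real runtime uses a fixed-point multiplier, and the scales below are made up; older converters emit uint8 with a 0..255 range rather than int8):

```python
import numpy as np

def requantize(acc_int32, in_scale, w_scale, out_scale, out_zero_point):
    """Rescale an int32 conv accumulator to the int8 output domain."""
    m = (in_scale * w_scale) / out_scale      # combined requantization multiplier
    q = np.round(acc_int32 * m) + out_zero_point
    return np.clip(q, -128, 127).astype(np.int8)

acc = np.array([1234, -567, 89], dtype=np.int32)   # toy accumulator values
print(requantize(acc, in_scale=0.02, w_scale=0.005, out_scale=0.1, out_zero_point=-3))
```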
4 votes • 0 answers

Invalid GraphDef in TensorFlow 2.2.0 saved model while using TF_GraphImportGraphDef in c_api

I saved a Keras model in TF 2.2.0 in Python via: model.save('model', save_format='tf') and it gives me a saved_model.pb in the "model" directory. I want to run inference via the C API, and the following code uses the function:…
Weishuo