
What is a Java 8 “view”?

I’m watching a talk by Paul Phillips:

http://www.youtube.com/watch?v=TS1lpKBMkgg

At 12:48 he says “in Java 8 their views actually work” while comparing Scala and Java.

What are Java “views”, and what is Scala’s equivalent?

Update: thanks to Daniel’s answer, I found this article helpful: http://www.scala-lang.org/docu/files/collections-api/collections_42.html

Answer

Java 8’s Stream is what he means by views. They have two important properties:

  1. They are non-strict, which means they only produce the result on-demand.
  2. They “fuse” together multiple operations, so you can do multiple map or filter calls, and the original collection will still be iterated only once.

Scala’s equivalent is the family of View collections, which you get by calling .view on an existing collection. They do have these properties (they are the defining properties, after all), but they are plagued by deficiencies and bugs, not to mention a very complex implementation.

Paul has toyed with alternative implementations on and off, but replacing them has never been a priority.
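The two properties above can be sketched in plain JavaScript with generators. This is a hand-rolled analogue for illustration only, not Java’s Stream API or Scala’s views: each stage wraps the upstream iterator lazily, so nothing runs until the result is consumed, and the source is traversed only once.

```javascript
// Minimal lazy-"view" sketch using generators (an illustrative analogue of
// Java 8 Streams / Scala views, not either real API).
function* lazyMap(iterable, f) {
  for (const x of iterable) yield f(x);          // applied on demand, one element at a time
}

function* lazyFilter(iterable, pred) {
  for (const x of iterable) if (pred(x)) yield x;
}

// Count how many times the underlying source is pulled from.
let pulls = 0;
function* source() {
  for (const x of [1, 2, 3, 4, 5]) { pulls++; yield x; }
}

// Building the pipeline runs nothing yet: the stages are "fused".
const view = lazyFilter(lazyMap(source(), x => x * 10), x => x > 20);

// Only now, on consumption, does the single traversal happen.
const result = [...view];                        // [30, 40, 50], with pulls === 5
```

Because map and filter wrap iterators rather than building intermediate arrays, the two operations happen in one pass over the source.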


Does Android KitKat allow devices that support Bluetooth LE to act as a peripheral device?

Up to Android 4.3, an Android device with Bluetooth Low Energy support could only act as a central device, as stated in the Android 4.3 Bluetooth LE docs:

Android 4.3 (API Level 18) introduces built-in platform support for Bluetooth Low Energy in the central role and provides APIs that apps can use to discover devices, query for services, and read/write characteristics.

With the introduction of Android 4.4 KitKat, can it also behave as a peripheral device now? I couldn’t find any documentation of that. If not, do we have to wait for Android 4.5 to get BLE peripheral mode on Android devices? Or is there another way in which an Android device can be made to act as a peripheral?

Answer

Thanks everyone for the answers. Just to update: as of June 2014, this is offered in the Android L Developer Preview, and I hope to see it in the official Android L release. From the blog post New in Android: L Developer Preview and Google Play Services 5.0:

The L Developer Preview lets you explore many of the new features and capabilities of the next version of Android, and offers everything you need to get started developing and testing on the new platform. Here are a few of the highlights for developers:

BLE Peripheral Mode — Android devices can now function in Bluetooth Low Energy (BLE) peripheral mode. Apps can use this capability to broadcast their presence to nearby devices — for example, you can now build apps that let a device function as a pedometer or health monitor and transmit data to another BLE device.

Update:

The bad news is that BLE peripheral mode only works on the newer Android devices (as of this date), namely the Nexus 6 and Nexus 9. It won’t work on the Nexus 4, Nexus 5, Nexus 7, or Nexus 10, even if you update them to Android 5.0. You can read the comment by one of the Android project managers on the “BLE advertise mode not working” ticket. He says:

We introduced BLE peripheral mode in Android 5.0 Lollipop. Nexus 6 and Nexus 9 are the first two production Nexus devices that support BLE peripheral mode. Due to hardware chipset dependency, older Nexus devices (4/5/7) will not have access to the feature on Lollipop.


Chrome Extension referencing/calling other script functions from a content script

I have a content script with a lot of functions in it, and I would like to split those functions out into other scripts.

Is there any magic needed to call other scripts from the content script?

My manifest contains both scripts:

"content_scripts": [
  {
    "matches": [
      "*://*/*"
    ],
    "js": [
      "content.js",
      "other.js"
    ]
  }
]

My content script is working fine.

However, if I put a function in the other.js file and step through it, anything I reference in other.js is undefined.

Is there anything I should know here?

Edit:

This is just a simple example; the Test function should run when the content script loads.

contentscript.js

Test();

other.js:

function Test(){
  return true;
}

Chrome is telling me: Uncaught ReferenceError: Test is not defined.

Answer

Just to add a little more for anyone looking for an answer about the other scripts, as well as other methods of accessing extension scripts.

You can access the rest of the extension’s scripts using the chrome.extension methods, as well as the chrome.runtime communication methods.

  1. To get an array of the Window objects for every page running inside the extension, you can use the extension.getViews method.

  2. You can also grab the background page itself with the getBackgroundPage method.

  3. Another option is message passing: a script sends data with the runtime.sendMessage method, while another script listens with a runtime.onMessage event listener, allowing it to receive the data from the sender.

  4. Building on the previous option, you can also use message passing to receive data from another active extension: the sender still uses runtime.sendMessage, but this time the listener uses runtime.onMessageExternal instead (which cannot be used in content scripts).
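A minimal sketch of the message passing from point 3. The message shape and helper names here are made up for illustration; chrome.runtime.sendMessage and chrome.runtime.onMessage are the real APIs:

```javascript
// Hypothetical message-passing sketch between a content script and the
// background script. buildRequest/handleRequest are illustrative helpers;
// the chrome.runtime calls are the standard extension messaging APIs.

// Pure helpers, so the protocol can be exercised outside a browser.
function buildRequest(kind, payload) {
  return { kind: kind, payload: payload };
}

function handleRequest(msg) {
  // Background-side handler: answer only the message kinds we know.
  if (msg && msg.kind === 'ping') return { ok: true, data: 'pong' };
  return { ok: false, data: null };
}

// Wiring, only when actually running inside an extension:
if (typeof chrome !== 'undefined' && chrome.runtime) {
  // background.js side:
  chrome.runtime.onMessage.addListener(function (msg, sender, sendResponse) {
    sendResponse(handleRequest(msg));
  });
  // content script side:
  chrome.runtime.sendMessage(buildRequest('ping', null), function (reply) {
    console.log('background replied:', reply.data);
  });
}
```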

I hope this helps someone, as much as it would have helped me earlier on.
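As for the original ReferenceError: scripts listed in the same content_scripts entry are injected into a single shared isolated world, in array order. Since content.js calls Test() before other.js has defined it, a likely fix is simply to reorder the js array:

```json
"content_scripts": [
  {
    "matches": ["*://*/*"],
    "js": ["other.js", "content.js"]
  }
]
```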


JavaScript: get element at a point outside the viewport

Is there something similar to document.elementFromPoint(x,y) that works for elements that are outside the viewport?

According to the MDN docs for document.elementFromPoint() (https://developer.mozilla.org/en-US/docs/DOM/document.elementFromPoint):

If the specified point is outside the visible bounds of the document or either coordinate is negative, the result is null.

So obviously it doesn’t work if you’re trying to grab elements beyond the user’s viewport.

Thanks!

Answer

I had the same issue: if the element is not within the current bounds of the viewport, elementFromPoint returns null.

I found that you have to scroll to the element’s location to bring it into the viewport, and then call elementFromPoint:

(function() {
  'use strict';
  var api;
  api = function(x,y) {
    var elm, scrollX, scrollY, newX, newY;
    /* stash the current window scroll position */
    scrollX = window.pageXOffset;
    scrollY = window.pageYOffset;
    /* scroll to element */
    window.scrollTo(x,y);
    /* calculate new relative element coordinates */
    newX = x - window.pageXOffset;
    newY = y - window.pageYOffset;
    /* grab the element */
    elm = this.elementFromPoint(newX,newY);
    /* revert to the previous scroll location */
    window.scrollTo(scrollX,scrollY);
    /* return the grabbed element at the absolute coordinates */
    return elm;
  };
  this.document.elementFromAbsolutePoint = api;
}).call(this);

You can simply use this whenever your coordinates are absolute page coordinates (pageX/pageY):

document.elementFromAbsolutePoint(2084, 1536);

This code is also on GitHub, packaged as a Bower component for easy inclusion in projects:

https://github.com/kylewelsby/element-from-absolute-point

Hope this helps your project.


decodeAudioData HTML5 Audio API

I want to play audio data from an ArrayBuffer, so I generate my array and fill it with microphone input. If I draw this data on a canvas, the waveform looks correct.

So this works!

But when I try to listen to this data with:

context.decodeAudioData(tmp, function(bufferN) { // tmp is an ArrayBuffer
    var out = context.createBufferSource();
    out.buffer = bufferN;
    out.connect(context.destination);
    out.noteOn(0);
}, errorFunction);

I don’t hear anything, because the errorFunction is called. But the error argument is null!

I also tried to get the buffer like this:

var soundBuffer = context.createBuffer(myArrayBuffer, true/*make mono*/);

But I get the error: Uncaught SyntaxError: An invalid or illegal string was specified.

Can anybody give me a hint?

EDIT 1 (More code and how I get the mic input):

navigator.webkitGetUserMedia({audio: true}, function(stream) {

    liveSource = context.createMediaStreamSource(stream);

    // create a ScriptProcessorNode
    if (!context.createScriptProcessor) {
        node = context.createJavaScriptNode(2048, 1, 1);
    } else {
        node = context.createScriptProcessor(2048, 1, 1);
    }

    node.onaudioprocess = function(e) {

        var tmp = new Uint8Array(e.inputBuffer.byteLength);
        tmp.set(new Uint8Array(e.inputBuffer.byteLength), 0);

        // Here comes the code from above.

Thanks for your help!

Answer

The error returned to the callback is null because, in the current Web Audio API spec, the error callback does not receive an error object:

callback DecodeSuccessCallback = void (AudioBuffer decodedData);
callback DecodeErrorCallback = void ();

    void decodeAudioData(ArrayBuffer audioData,
                         DecodeSuccessCallback successCallback,
                         optional DecodeErrorCallback errorCallback);

DecodeSuccessCallback is raised when the complete input ArrayBuffer has been decoded and stored internally as an AudioBuffer, but for some reason decodeAudioData cannot decode a live stream.

You can try to play the captured audio by copying the input buffer into the output buffer while processing audio:

function connectAudioInToSpeakers() {

  navigator.webkitGetUserMedia({audio: true}, function(stream) {

    var context = new webkitAudioContext();
    var canvas = document.getElementById("myCanvas");
    var ctx = canvas.getContext("2d");   // 2D context used for drawing below
    liveSource = context.createMediaStreamSource(stream);

    // create a ScriptProcessorNode
    if (!context.createScriptProcessor) {
       node = context.createJavaScriptNode(2048, 1, 1);
    } else {
       node = context.createScriptProcessor(2048, 1, 1);
    }

    node.onaudioprocess = function(e) {

        try {
            // resetting the width also clears the canvas
            canvas.width = canvas.width;
            ctx.fillStyle = "#FF0000";

            var input = e.inputBuffer.getChannelData(0);
            var output = e.outputBuffer.getChannelData(0);
            for (var i = 0; i < input.length; i++) {
                output[i] = input[i];   // copy the input straight to the output
                ctx.fillRect(i / 4, input[i] * 500 + 200, 1, 1);
            }
        } catch (err) {
            console.log('node.onaudioprocess', err.message);
        }

    };

    // connect the ScriptProcessorNode with the input audio
    liveSource.connect(node);
    // if the ScriptProcessorNode is not connected to an output,
    // the "onaudioprocess" event is not triggered in Chrome
    node.connect(context.destination);

    // route the mic input directly to the speakers:
    //liveSource.connect(context.destination);
  });
}
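For what it’s worth, a likely reason decodeAudioData rejects the buffer in the question is that it expects a complete encoded audio file (WAV, MP3, Ogg, and so on), not raw PCM samples. A hedged alternative for raw Float32 samples is to build an AudioBuffer directly with createBuffer and copy the data in. copySamples below is an illustrative helper; the Web Audio calls are standard:

```javascript
// Sketch: play raw Float32 samples by building an AudioBuffer directly,
// instead of decodeAudioData (which expects an encoded file such as WAV).
// copySamples is a hypothetical helper; the Web Audio calls are standard.
function copySamples(src, dst) {
  // Copy as many samples as fit into the destination channel.
  var n = Math.min(src.length, dst.length);
  for (var i = 0; i < n; i++) dst[i] = src[i];
  return n;
}

if (typeof AudioContext !== 'undefined' || typeof webkitAudioContext !== 'undefined') {
  var Ctx = typeof AudioContext !== 'undefined' ? AudioContext : webkitAudioContext;
  var context = new Ctx();
  var samples = new Float32Array(context.sampleRate);  // e.g. one second of audio
  // createBuffer(channels, frames, sampleRate): a mono buffer here
  var buffer = context.createBuffer(1, samples.length, context.sampleRate);
  copySamples(samples, buffer.getChannelData(0));
  var src = context.createBufferSource();
  src.buffer = buffer;
  src.connect(context.destination);
  src.start(0);
}
```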
Source: stackoverflow
Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. Content is available under CC BY-SA 3.0 unless otherwise noted. The answers are collected from Stack Overflow and are licensed under CC BY-SA 2.5, CC BY-SA 3.0, or CC BY-SA 4.0.