Author: 56zliqinenwq

  • boat

    Boat

    This project is a single-page landing site dedicated to promoting boat trips, yachting, and rental services. The site is designed to provide a visually appealing and informative experience for users interested in maritime adventures. Below is a detailed description of each section of the website.

    Table of Contents

    Routing

    The app uses react-scroll for navigation. The available sections are:

    • Home – Welcome section includes:

    Navigation: A responsive navigation menu that adapts seamlessly to mobile devices, ensuring easy access to all sections of the site.
    Main Banner: Features an eye-catching image of a motorboat riding the waves of the ocean, setting the tone for the adventurous theme of the site.

    • Speedy – includes:

    Yacht Display: Showcases images of various yachts, highlighting the different options available for rent.
    Animated Card: An engaging, animated card providing detailed information and options for renting a vessel, making the user experience interactive and dynamic.

    • Safety – includes:

    Importance of Safety: This section emphasizes the critical importance of safety while boating, ensuring users understand the measures taken to protect them during their maritime adventures.

    • Gallery – includes:

    Photo Slider: A gallery presented in a slider format, displaying beautiful and captivating images of boats, yachts, and ocean scenes to inspire and attract potential customers.

    • Contact – includes:

    Contact Form: A simple and effective form for users to get in touch, ask questions, or request further information about the services offered.

    Features

    Implemented responsive design and a mobile menu for a better UI.

    Getting Started

    To run the application locally, follow these steps:

    1. Clone the repository: git clone https://github.com/Inna-Mykytiuk/boat.git
    2. Install dependencies: npm install
    3. Run the app: npm run dev
    4. Open http://localhost:3000 in your browser (note: a different port may be used if port 3000 is already occupied by another application).

    Technologies Used

    1. Next.js: Our website is powered by Next.js, providing a seamless and efficient user experience. It ensures fast loading times and smooth navigation.

    2. Tailwind CSS: Used for styling, offering a highly customizable and utility-first approach to CSS that ensures responsive and attractive design.

    3. React-Scroll: Enhancing the scrolling experience on our website, React-Scroll adds a touch of sophistication, allowing users to glide smoothly between sections.

    4. TypeScript: Implemented for static type checking, enhancing code quality and reducing errors during development.

    5. Framer Motion: Our project incorporates Framer Motion to bring life to the user interface with stunning animations. Framer Motion is a React library that simplifies the creation of smooth and interactive animations, adding a layer of dynamism to various elements on our website.

    6. Swiper: Used for creating the responsive, touch-friendly photo slider in the Gallery section, providing a seamless viewing experience on all devices.

    Summary

    This landing page effectively combines visual appeal with functional design to create an engaging user experience for individuals interested in boat trips and yacht rentals. By leveraging modern web development technologies, the site ensures high performance, responsiveness, and an interactive user interface. The strategic use of animations, smooth scrolling, and an intuitive navigation system further enhances the overall experience, making it easy for users to explore the services offered and get in touch for more information.


    Visit original content creator repository https://github.com/Inna-Mykytiuk/boat
  • COSCO

    COSCO Framework

    COSCO is an AI-based coupled-simulation and container orchestration framework for integrated Edge, Fog and Cloud Computing Environments. It’s a simple Python-based software solution, where academics or practitioners can develop, simulate, test and deploy their scheduling policies. Further, this repo presents a novel gradient-based optimization strategy using deep neural networks as surrogate functions and co-simulations to facilitate decision making. A tutorial of the COSCO framework was presented at the International Conference on Performance Engineering (ICPE) 2022. Recording available here.

    Advantages of COSCO

    1. Hassle-free development of AI-based scheduling algorithms in integrated edge, fog and cloud infrastructures.
    2. Provides seamless integration of scheduling policies with a simulated back-end for enhanced decision making.
    3. Supports container migration on physical deployments (not supported by other frameworks) using the CRIU utility.
    4. Multiple deployment options as per the needs of developers (Vagrant VM testbed, VLAN Fog environment, cloud deployment using Azure/AWS/OpenStack).
    5. Equipped with real-time graph generation of utilization metrics using InfluxDB and Grafana.
    6. Real-time metrics monitoring, logging and consolidated graph generation using a custom Stats logger.

    The basic architecture of COSCO has two main packages:
    Simulator: a discrete-event simulator that runs on a standalone system.
    Framework: a tool for testing scheduling algorithms in a physical (real-time) fog/cloud environment with real-world applications.

    Supported workloads: (Simulator) Bitbrains and Azure2017/2019; (Framework) DeFog and AIoTBench.

    Our main COSCO work uses the Bitbrains and DeFog workloads. An extended work, MCDS (see workflow branch), accepted in IEEE TPDS uses scientific workflows. Check paper and code.

    Novel Scheduling Algorithms

    We present two novel algorithms in this work: GOBI and GOBI*. GOBI uses a neural network as a surrogate model and performs gradient-based optimization by backpropagating gradients to the input. Advances like cosine annealing and momentum allow it to converge to an optimum quickly. Moreover, GOBI* leverages a coupled simulation engine, akin to a digital twin, to further improve the surrogate accuracy and, subsequently, the scheduling decisions. Experiments conducted using real-world data on fog applications show that GOBI and GOBI* improve energy consumption, response time, Service Level Objective, and scheduling time by up to 15, 40, 4, and 82 percent respectively, compared to state-of-the-art algorithms.
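    To illustrate the core idea, here is a minimal pure-Python sketch (not COSCO's actual code) of gradient descent with momentum on the *input* of a surrogate objective. In GOBI the surrogate is a trained neural network predicting the QoS objective and the input is the scheduling decision; here a simple differentiable function and a finite-difference gradient stand in for both:

```python
def surrogate(x):
    # Stand-in for the neural surrogate's predicted QoS objective;
    # its minimum is at x[i] = 0.3 * i.
    return sum((xi - 0.3 * i) ** 2 for i, xi in enumerate(x))

def grad(x, eps=1e-6):
    # Central finite-difference gradient with respect to the input
    # (GOBI instead backpropagates through the network to the input).
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += eps
        xm = list(x); xm[i] -= eps
        g.append((surrogate(xp) - surrogate(xm)) / (2 * eps))
    return g

def optimize_input(x, lr=0.1, momentum=0.9, iters=200):
    # Gradient descent with momentum on the input vector.
    v = [0.0] * len(x)
    for _ in range(iters):
        g = grad(x)
        v = [momentum * vi - lr * gi for vi, gi in zip(v, g)]
        x = [xi + vi for xi, vi in zip(x, v)]
    return x

x_opt = optimize_input([1.0, 1.0, 1.0])  # converges toward [0.0, 0.3, 0.6]
```

    The real GOBI additionally uses tricks such as cosine annealing of the learning rate; the loop above only shows the momentum part.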

    Supplementary video


    A detailed course on using the COSCO framework for deep learning based scheduling (deep surrogate optimization and co-simulation) in fog environments is available as a youtube playlist.

    Quick Start Guide

    To run the COSCO framework, install required packages using

    python3 install.py

    To run the code with the required scheduler, modify line 106 of main.py to one of the several options including LRMMTR, RF, RL, RM, Random, RLRMMTR, TMCR, TMMR, TMMTR, GA, GOBI.

    scheduler = GOBIScheduler('energy_latency_'+str(HOSTS))

    To run the simulator, use the following command

    python3 main.py

    Gitpod

    You can directly run tests on the results using a Gitpod Workspace without needing to install anything on your local machine. Click “Open in Gitpod” below and test the code by running python3 main.py.

    Open in Gitpod

    Wiki

    Access the wiki for detailed installation instructions, implementing a custom scheduler, and replicating results. All execution traces and training data are available at Zenodo under a CC license.

    Links

    Paper: https://ieeexplore.ieee.org/document/9448450 (with the “Code Reviewed Badge”)
    Pre-print: https://arxiv.org/pdf/2104.14392.pdf
    Documentation: https://github.com/imperial-qore/COSCO/wiki
    Video: https://youtu.be/RZOWTj0rfBQ
    Tutorial: https://www.youtube.com/playlist?list=PLN_nzHzuaOBQijEwy2Fy8c09-dWYVe4XO
    ICPE Tutorial: https://youtu.be/osjpaNmkm_w
    Extensions: QoS-aware scheduling (TPDS’22, code), energy-aware sustainable computing (JSS’21, code), EdgeAI (SIGMETRICS’21 Poster, NeurIPS’21 Workshop) and fault tolerance (INFOCOM’22, code)
    Contact: Shreshth Tuli (@shreshthtuli)
    Funding: Imperial President’s scholarship, H2020-825040 (RADON)

    Cite this work

    Our work is published in the IEEE TPDS journal. Cite it using the following BibTeX entry.

    @article{tuli2021cosco,
      author={Tuli, Shreshth and Poojara, Shivananda R. and Srirama, Satish N. and Casale, Giuliano and Jennings, Nicholas R.},
      journal={IEEE Transactions on Parallel and Distributed Systems}, 
      title={{COSCO: Container Orchestration Using Co-Simulation and Gradient Based Optimization for Fog Computing Environments}}, 
      year={2022},
      volume={33},
      number={1},
      pages={101-116},
    }

    License

    BSD-3-Clause. Copyright (c) 2021, Shreshth Tuli. All rights reserved.

    See License file for more details.

    Visit original content creator repository https://github.com/imperial-qore/COSCO
  • argocd-dex

    argocd-dex

    ArgoCD with Dex Configuration

    Install ArgoCD

    • To install Argo CD (update the version if required):
          kubectl apply -k argocd-install/
    • If you are using a public cloud, update the svc for argocd-server to type LoadBalancer:
      kubectl patch svc argocd-server -n argocd -p '{"spec": {"type": "LoadBalancer"}}'
    • Get the initial admin secret:
      kubectl -n argocd get secret argocd-initial-admin-secret -o jsonpath="{.data.password}" | base64 -d; echo
      
    • Log in to the Argo CD server to update the password
      argocd login <ARGOCD_SERVER> # just the IP:PORT
      argocd account update-password

    ArgoCD Dex Integration with Microsoft connector

    Prerequisites

    • Create a new application in Azure AD OIDC; follow this guide for the same: Quick Start: Register an application
    • The callback URL in your Azure AD application would be:
      • https://<ArgoCD_Server_IP/URL>/api/dex/callback
    • This was tested on GKE and still needs to be verified with localhost, but it should work for localhost too wherever the IP/URL is required.

    Let’s run some commands now to get it working

    Microsoft Connector

    • The following file needs to be updated:

      • microsoft-connector/argocd-extra.yaml

        • <Your Base64 Client Secret> : the Base64-encoded client secret created in Azure AD for the OIDC application
        • <Your Client/ApplicationID of Azure app> : the Client/Application ID of the Azure AD OIDC application
        • <ArgoCD_Server_IP/URL> : if you port-forward the application to localhost, it should be localhost; if you create a NodePort-type service, it should be localhost:nodeport; if you create a LoadBalancer, it will be your LoadBalancer IP
    • Let’s first apply the file with the ConfigMap and Secret changes

      kubectl apply -f microsoft-connector/argocd-extra.yaml -n argo 
      
    • Let’s install Argo CD now

      kubectl apply -f microsoft-connector/argocd-install.yaml -n argo 
      
    • Now port-forward your server to localhost or use the LoadBalancer IP, and you should see the following screen:

    • The URL should be:

      • https://localhost OR https://<LoadBalancerIP>
      • Click on AZURETEST and it should authenticate you with Azure AD

      Login Page

    Visit original content creator repository https://github.com/tiwarisanjay/argocd-dex
  • airline-consortium

    Steps to run the project:

    • Download the latest binaries for Node.js and MongoDB.
    • Create a database in MongoDB and call it ask_db.
    • Add a database owner account using the following snippet:
    db.createUser({
      user: "ask_admin",
      pwd: "your password here",
      roles: [{ role: "dbOwner", db: "ask_db" }],
      passwordDigestor: "server"
    });
    • Install Truffle using npm install -g truffle and install the Ganache binary or CLI.

    • Start the Ganache process using the CLI or open the Ganache GUI; this starts the Ganache private network process.

    • Open a terminal, cd into the truffle-build directory, and type truffle compile to compile the smart contract. Then type truffle migrate to deploy the smart contract to Ganache.

    • Create a .env file in the root of the project. Add the following variables in the .env file:

      DB_URL="mongodb://<username>:<password>@localhost:27017/<db_name>"
      CONTRACT_ADDRESS="<contract address>"
      BC_HOST_URL="http://localhost:7545"
    • In a terminal, cd into the project root and run npm install.

    • Once all the dependencies are installed, run npm run dev-test-run.

    • Open your browser and navigate to http://localhost:8000.

    Screens:

    Signup Page:

    Passenger Signup:

    Airline Signup:

    Login Page:

    User Landing Page:

    Airline Landing Page:

    Passenger Landing Page:

    Passenger Purchases Page:

    Airline pending requests page:

    Airline sending request to other airline:

    Airline sending response to passenger:

    Transactions page:

    Visit original content creator repository https://github.com/socket-var/airline-consortium
  • messenger-sdk-ios

    Deskpro

    messenger-sdk-ios


    DeskPro iOS Messenger is a Chat/AI/Messaging product. You can embed a “widget” directly into your native app, enabling end-users to use the product. A similar implementation exists for Android.

    Requirements

    • iOS 11.0+
    • Swift 5.7+
    • Xcode 14.0+

    Installation

    • File > Swift Packages > Add Package Dependency
    • Add https://github.com/deskpro/messenger-sdk-ios
    • Select “Up to Next Major” version

    Manual installation

    Although we recommend using SPM, it is also possible to clone this repository manually, and drag and drop it into the root folder of the application.

    Initialization (Swift)

    First, import the SDK:

    import messenger_sdk_ios
    

    Then, in your ViewController:

    let messengerConfig = MessengerConfig(appUrl: "YOUR_APP_URL", appId: "YOUR_APP_ID")
    var messenger: DeskPro?
    

    Replace YOUR_APP_URL and YOUR_APP_ID with your app’s URL and ID.

    override func viewDidLoad() {
        super.viewDidLoad()    
        messenger = DeskPro(messengerConfig: messengerConfig, containingViewController: self)
    }
    

    To open a Messenger, paste this line example in the desired place:

    messenger?.present().show()
    

    Initialization (Objective-C)

    First, import the SDK:

    @import messenger_sdk_ios;
    

    Then, in your ViewController.h:

    @property (strong, nonatomic) MessengerConfig *messengerConfig;
    @property (strong, nonatomic) DeskPro *messenger;
    

    Then, in your ViewController.m:

    - (void)viewDidLoad {
        [super viewDidLoad];
    
        self.messengerConfig = [[MessengerConfig alloc] initWithAppUrl:@"YOUR_APP_URL" appId:@"YOUR_APP_ID" appKey:@"YOUR_APP_KEY"];
        self.messenger = [[DeskPro alloc] initWithMessengerConfig:self.messengerConfig containingViewController:self enableAutologging:false];
    }
    

    Replace YOUR_APP_URL and YOUR_APP_ID with your app’s URL and ID, and YOUR_APP_KEY with your app’s KEY, or nil.

    To open a Messenger, paste this line example in the desired place:

    [[self.messenger present] show];
    

    Note: You can create multiple Messenger instances.

    Setting user info (Swift)

    messenger?.setUserInfo(user: userObject)
    

    Setting user info (Objective-C)

    [self.messenger setUserInfoWithUser:userObject];
    

    Note: User(name, firstName, lastName, email)

    Authorize user (Swift)

    messenger?.authorizeUser(jwtToken: jwtToken)
    

    Authorize user (Objective-C)

    [self.messenger authorizeUserWithUserJwt:jwtToken];
    

    Push notifications (Swift)

    messenger?.setPushRegistrationToken(token: token)
    

    Push notifications (Objective-C)

    [self.messenger setPushRegistrationTokenWithToken:token];
    

    Prerequisite: The application should be connected to the notifications platform, enabled for receiving notifications and obtaining tokens.

    Privacy

    In order to make the file upload and download fully work, make sure to add these permissions with appropriate messages in your Info.plist file:

    • Privacy – Camera Usage Description
    • Privacy – Microphone Usage Description
    • Privacy – Photo Library Additions Usage Description

    Versioning

    We use SemVer for versioning. For the versions available, see the tags on this repository.

    Visit original content creator repository https://github.com/deskpro/messenger-sdk-ios
  • DevKit

    COBI.Bike DevKit

    A collection of Open Source components to develop modules for COBI.Bike – the perfect fusion of smartphone and bike.


    💡 Interactive Demo: Learn the fundamentals

    The quickest way to learn the DevKit basics without writing any code.


    Change location coordinates and hit thumb controller buttons to see COBI.js in action. This simulates data and interaction events that will later be provided by the COBI.Bike system when riding a bike. Bonus points for directly tweaking the code, e.g. subscribing to additional data from the COBI.js data stream.

    🚀 Let’s get started with your first project

    It only takes a few lines of javascript to turn Web Apps into a module:

    Step 1: Add boilerplate code

    To get your Web App ready just add COBI.js at the end of the body section of your HTML:

    <script src="https://cdn.cobi.bike/cobi.js/0.44.0/cobi.js"></script>

    and pass an authentication token to the COBI.init function before subscribing to the data stream. Tokens are not issued yet, so you can use any string for now:

    // Authenticate your module
    COBI.init('token — can be anything right now')

    It’s that easy: Any Web App + COBI.js = Module!

    Step 2: Hook into the data stream

    Enough with the boilerplate code, let’s make our new module respond to the handlebar remote control:

    COBI.hub.externalInterfaceAction.subscribe(function(action) {
      console.log('I just tapped the handlebar remote and instantly received this ' + action + ' in my Web App');
    });

    or visualize the cadence acquired by COBI.Bike app from an external Bluetooth sensor or e-bike motor:

    COBI.rideService.cadence.subscribe(function(cadence) {
        console.log('Your current cadence is ' + cadence + ' rpm.');
    });

    There is a ton of data available such as current speed, course, heart-rate (if heart-rate monitor is connected), power, calories burned and much more. Our COBI.js reference will be your friend.

    🔬 Test your module

    Now that you have supercharged your Web App, you can test your module either in the Chrome browser on your machine or directly in the COBI.Bike iOS App on your bike.

    Browser testing

    Just install the DevKit Simulator Chrome Extension, open up the Developer Tools (⌘ + Option + i / Ctrl + Shift + j) and select the »COBI.Bike« tab.
    To get the best experience, switch on the phone mode in the upper left corner and rotate the device to landscape. To simulate riding and fitness data you can play back one of our sample cobitrack or GPX files.

    On-bike testing

    If you don’t own a COBI.Bike yet, apply for a hardware development kit at cobi.bike/devkit or purchase one at get.cobi.bike. Afterwards, register as a developer to test your module on your bike.


    Ready? Then open up the COBI.Bike app on your iPhone and open the edit modules screen. As a developer, you can now choose from a number of example modules or add your own via »My Module«.

    When you open »My Module« on the home screen or the dashboard, you can enter the URL of your module (it can be hosted wherever you want, but we have some suggestions below). When you press »Open module« your module is loaded and hooked up to the app. Now you can easily test your idea on your 🚲.

    Screenshots: COBI.Bike iOS app Home, Edit Modules, and My Module screens.

    🏓 Play ping-pong with the COBI.Bike app

    Take advantage of interfaces to the native COBI.Bike app to save yourself a lot of work.

    Start a turn-by-turn navigation to a destination:

    COBI.navigationService.control.write({
      'action': 'START', 
      'destination': {'latitude': 50.110924,'longitude': 8.682127}
    });

    Open a phone number picker with the list of contacts:

    COBI.app.contact.read(function(contact) {
      console.log(contact);
    });

    Hook into the voice feedback system:

    COBI.app.textToSpeech.write({'content' : 'Can you hear my voice?', 'language' : 'en-US'});

    Claim the entire screen space by hiding the clock in the top right corner:

    COBI.app.clockVisible.write(false);

    Claim all Thumb Controller buttons on e-bikes that are reserved for motor control by default:

    COBI.devkit.overrideThumbControllerMapping.write(true);

    Check out the COBI.js reference for more.

    🎛 Settings for your Module

    A module can be shown in different contexts. There are three pieces of information that you should adapt to:

    1. The device orientation: your module should automatically adapt to the available screen size
    2. The value of COBI.parameters.context(), which can be one of
    • COBI.context.onRide
    • COBI.context.offRide
    • COBI.context.onRideSettings
    • COBI.context.offRideSettings
    3. The value of COBI.app.touchInteractionEnabled, which changes when you start/stop riding

    Flexible layout

    Take a look at our COBI.Bike DevKit UI Components for an easy way to create a UI for your settings.

    Module context

    There are 4 contexts you should support. You can check COBI.parameters.context() at any time to decide if some sort of settings should be shown (COBI.context.onRideSettings or COBI.context.offRideSettings) or if the actual module is requested (COBI.context.onRide or COBI.context.offRide). To share information between the two contexts and in between user sessions use the web standard Local Storage.

    Touch Interaction Changes

    While riding, the user is encouraged to use the thumb controller instead of interacting with the UI via touch. Subscribe to changes of this value to make the best out of both situations. Please note that a bar is shown at the top while touchInteractionEnabled is false — make sure it is not overlapping with your UI.

    COBI.app.touchInteractionEnabled.subscribe(function(enabled) {
        // Adapt your UI
    });

    Module Contexts

    🌈 Everything else about the DevKit

    Debugging Tips & Tricks

    • For seeing JavaScript errors in the native app, activate the “Show module errors” option in the “Diagnostics” section
    • To get better error messages when interacting with the COBI.js API, include https://cdn.cobi.bike/cobi.js/0.44.0/cobi.dev.js instead of the script mentioned above (please note: the dev version is considerably larger which has a big impact on the loading time)
    • To show a native dialog when running inside the iOS App, just use a normal alert("your messages") (only for debugging)
    • When developing in Chrome, use the phone button in the upper left corner of the Chrome Developer Tools and rotate it to landscape to see how it looks while riding
    • When using the Chrome Simulator, press the Print state to console button to print the current COBI.js state to the Chrome Developer Tools Console
    • To change the current context append ?context=onRide or your desired context to your URL in the browser.

    Inspiration & Examples

    Interface Guidelines

    Read our Interface Guidelines to understand the unique challenges of developing software for bikes and to learn more about how the COBI.Bike system and modules work.

    More DevKit Resources

    Other Tools & Resources

    • Glitch – friendly community where you’ll build the app of your dreams
    • CodePen – social development environment for front-end designers and developers

    👏 Contributing to this project

    Anyone and everyone is welcome to contribute to this project, the DevKit Simulator and the COBI.Bike DevKit UI Components. Please take a moment to review the guidelines for contributing.

    Copyright © 2020 Robert Bosch GmbH

    Visit original content creator repository https://github.com/cobi-bike/DevKit
  • yandex-disk

    Visit original content creator repository
    https://github.com/nedobylskiy/yandex-disk

  • Episomizer

    Episomizer

    Episomizer is currently a semi-automated pipeline for constructing double minutes (aka. episome)
    using WGS data.

    Episomizer consists of two major components:

    • Bam mining extracts the reads around the boundaries of highly amplified genomic regions and
      searches for evidence of soft-clipped reads, discordant reads, and bridge reads that support
      putative SVs (aka. edges) between any two segment boundaries. The reported putative edges are subject
      to manual review.
    • Composer takes inputs of manually reviewed edges associated with the segment boundaries together
      with the highly amplified genomic segments, composes the segments to form simple
      cycles as candidates of circular DNA structures.

    Citation

    Xu, K., Ding, L., Chang, TC., Shao, Y., Chiang, J., Mulder, H., Wang, S., Shaw, T.I., Wen, J.,
    Hover, L., McLeod, C., Wang, YD., Easton, J., Rusch, M., Dalton, J., Downing, J.R., Ellison, D.W.,
    Zhang, J., Baker, S.J., Wu, G.
    Structure and evolution of double minutes in diagnosis and relapse brain tumors.
    Acta Neuropathologica, Sep 2018, DOI: 10.1007/s00401-018-1912-1.

    Prerequisites

    Installation

    To install, simply clone the repository to a working directory and add $EPISOMIZER_HOME/bin to $PATH.

    $ EPISOMIZER_HOME=<path_to_working_dir>
    $ export PATH=$EPISOMIZER_HOME/bin:$PATH
    

    Usage

    Usage:
        episomizer <SUBCOMMAND> [args...]
    Subcommands:
        create_samtools_cmd    Create samtools command file to extract reads around segment boundaries
        create_softclip2fa_cmd Create command file to extract softclip reads
        create_blat_cmd        Create command file to blat softclip reads
        SV_softclip            Create read count matrix for softclip reads supported SVs
        SV_discordant          Create read count matrix for discordant reads supported SVs
        SV_bridge              Create read count matrix for bridge reads supported SVs
        matrix2edges           Convert read count matrix to putative edges
        composer               Compose segments and edges to identify circular DNA structures
    

    For details on how to run the semi-automated pipeline, see the Procedure section below.
    For a concrete example of constructing double minutes on a mini-BAM file, see the examples page.

    Procedure

    Step 1: Determine a threshold for highly amplified genomic segments based on the empirical distribution
    of Log2Ratio of copy number data.
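    As a sketch of what this step might look like (the cutoff rule and the n_sigma value are illustrative assumptions, not taken from the paper), one could flag segments lying in the right tail of the empirical Log2Ratio distribution:

```python
import statistics

def amplification_threshold(log2_ratios, n_sigma=3.0):
    # Illustrative cutoff: mean + n_sigma * standard deviation of the
    # empirical Log2Ratio distribution; n_sigma is a tunable assumption.
    mu = statistics.fmean(log2_ratios)
    sd = statistics.stdev(log2_ratios)
    return mu + n_sigma * sd

# Toy data: two segments stand out as highly amplified.
ratios = [0.0, 0.1, -0.2, 0.05, 3.8, 4.2, -0.1, 0.15]
cutoff = amplification_threshold(ratios, n_sigma=1.0)
amplified = [r for r in ratios if r > cutoff]
```

    In practice the threshold is chosen by inspecting the distribution itself, not by a fixed formula.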

    Step 2: Get the putative edges.

    1. Generate the shell script with samtools commands to extract the reads around segment boundaries.

      $ episomizer create_samtools_cmd INPUT_BAM INPUT_CNA_BED OUTPUT_DIR
      

      Run the shell script.

      $ OUTPUT_DIR/run_samtools.sh 
      
    2. Generate the shell script to extract softclip reads.

      $ episomizer create_softclip2fa_cmd INPUT_CNA_BED OUTPUT_DIR
      

      Run the shell script.

      $ OUTPUT_DIR/run_softclip2fa.sh
      
    3. Generate the shell script to blat the softclip reads.

      $ episomizer create_blat_cmd REF_GENOME_BIT INPUT_CNA_BED OUTPUT_DIR
      

      The reference genome GRCh37-lite.2bit can be downloaded from
      St. Jude public FTP site and can be placed under the working directory.

      Run the shell script (submitting the jobs in parallel is strongly recommended).

      $ OUTPUT_DIR/run_BLAT.sh
      
    4. Create 3 read count matrices using softclip reads, discordant reads and bridging discordant reads.

      $ episomizer SV_softclip INPUT_CNA_BED FLANK SOFTCLIP_BLAT_DIR OUTPUT_DIR
      $ episomizer SV_discordant INPUT_CNA_BED TLEN FLANK BOUNDARY_READS_DIR OUTPUT_DIR
      $ episomizer SV_bridge INPUT_CNA_BED TLEN DISTANCE BOUNDARY_READS_DIR OUTPUT_DIR
      
    5. Convert matrix file to edges file.

      $ episomizer matrix2edges INPUT_CNA_BED MATRIX_FILE OUTPUT_EDGE_FILE
      

    Step 3: Manually review the putative edges.

    For all 3 types of putative edges (softclip, discordant, and bridge) from the above output, the edges are sorted by the total number of supporting reads from high to low.
    These putative edges need to be manually reviewed by examining the boundary reads and their Blat results in the “trace” folder,
    as well as by examining the coverage depth around the segments in IGV to refine the segment boundaries. Edges with few supporting reads
    on each side are usually spurious or indicate minor clones, which we do not consider in our current study. The basic review process is described below:

    For the softclip edges, the edges that represent adjacent segments should be annotated first. Then for each of the remaining
    edges, the Blat output needs to be manually reviewed to determine the orientations of the two joined segments, so that the
    true edge is annotated and accompanying false edges are removed.

    For the discordant edges, the edges already reviewed among the softclip edges can be removed first. Then for each of the
    remaining edges, the boundary reads (in SAM format) need to be manually reviewed, especially the FLAG column, to determine
    the orientations of the two joined segments, so that the true edge is annotated and accompanying false edges are removed.

    For the bridge edges, the edges already reviewed among the softclip and discordant edges can be removed first.
    Then for each of the remaining edges, the boundary reads (in SAM format) need to be manually reviewed to determine whether the bridging segment’s
    orientation is in harmony with those of the two joined segments. The true edge is annotated and accompanying false edges are removed.
    In addition, if the read support from the two sides is off balance (for example, AtoB is 1 but BtoA is 100), the edge is most likely spurious.

    The reviewed edges from the 3 putative edge files are combined into one “edge” file as part of the input for the next step.
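    The off-balance heuristic mentioned above can be sketched as a simple ratio test; the threshold value here is a hypothetical choice for illustration, not one prescribed by the pipeline:

```python
def is_balanced(a_to_b, b_to_a, max_ratio=10.0):
    # An edge with highly asymmetric read support (e.g. AtoB = 1 vs
    # BtoA = 100) is likely spurious; max_ratio is an assumed cutoff.
    if a_to_b == 0 or b_to_a == 0:
        return False
    hi, lo = max(a_to_b, b_to_a), min(a_to_b, b_to_a)
    return hi / lo <= max_ratio
```

    An edge failing this check would be flagged for removal during manual review rather than dropped automatically.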

    Step 4: Compose circular double minute structures.

    $ episomizer composer circ -c REVIEWED_SEGMENTS -l REVIEWED_EDGES -d OUTPUT_DOUBLE_MINUTES
    
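    Conceptually, the composer treats the amplified segments as nodes and the reviewed edges as connections, then searches for simple cycles. The sketch below is a simplification that ignores segment orientation and copy number (which the real composer must handle); it only enumerates simple cycles in an undirected graph:

```python
def simple_cycles(nodes, edges):
    # Build an undirected adjacency map from reviewed edges.
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    cycles = set()

    def dfs(start, current, path, visited):
        for nxt in adj[current]:
            if nxt == start and len(path) >= 3:
                # Canonicalize by node set so each cycle is reported once
                # (sufficient for this toy; real cycles need orientation).
                cycles.add(tuple(sorted(path)))
            elif nxt not in visited:
                dfs(start, nxt, path + [nxt], visited | {nxt})

    for n in nodes:
        dfs(n, n, [n], {n})
    return sorted(cycles)

# Segments A, B, C form a candidate circular structure; D dangles off C.
candidates = simple_cycles(['A', 'B', 'C', 'D'],
                           [('A', 'B'), ('B', 'C'), ('C', 'A'), ('C', 'D')])
```

    Each reported cycle is only a candidate; the actual tool validates candidates against segment orientations and copy-number constraints.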

    Maintainers

    Visit original content creator repository
    https://github.com/stjude/Episomizer


  • bittube-blockchain-explorer

    BitTube Blockchain Explorer

    Based on Onion Bittube Blockchain Explorer

    Explorer hosts

    Testnet version:

    BitTube Blockchain Explorer features

    The key features of the BitTube Blockchain Explorer are:

    • no cookies, no web analytics trackers, no images,
    • open source,
    • written fully in C++,
    • showing encrypted payment IDs,
    • showing ring signatures,
    • showing the transaction extra field,
    • showing public components of BitTube addresses,
    • decoding which outputs and mixins belong to a given BitTube address and viewkey,
    • can prove that you sent BitTube to someone,
    • detailed information about ring members, such as their age, time scale, and ring sizes,
    • showing the number of amount output indices,
    • support for BitTube testnet and stagenet networks,
    • tx checker and pusher for online pushing of transactions,
    • estimating possible spending based on address and viewkey,
    • can provide the total amount of all miner fees,
    • decoding encrypted payment IDs,
    • decoding outputs and proving txs sent to a sub-address,
    • listing RandomX code for each block.

    Compilation on Ubuntu 16.04/18.04

    Compile latest BitTube development version

    Download and compile recent BitTube into your home folder:

    # first install BitTube dependencies
    sudo apt update
    
    sudo apt install git build-essential cmake libboost-all-dev miniupnpc libunbound-dev graphviz doxygen libunwind8-dev pkg-config libssl-dev libcurl4-openssl-dev libgtest-dev libreadline-dev libzmq3-dev libsodium-dev libhidapi-dev libhidapi-libusb0
    
    # go to home folder
    cd ~
    
    git clone --recursive https://github.com/ipbc-dev/bittube.git bittube
    
    cd bittube/

    Follow the BitTube compilation instructions in
    https://github.com/bittubeexamples/bittube-compilation/blob/master/README.md

    Compile and run the explorer

    Once BitTube is compiled, the explorer can be downloaded and compiled
    as follows:

    # go to home folder if still in ~/bittube
    cd ~
    
    # download the source code
    git clone https://github.com/ipbc-dev/bittube-blockchain-explorer.git
    
    # enter the downloaded sourced code folder
    cd bittube-blockchain-explorer
    
    # make a build folder and enter it
    mkdir build && cd build
    
    # create the makefile
    cmake ..
    
    # alternatively you can use: cmake -DBITTUBE_DIR=/path/to/bittube_folder ..
    # if BitTube is not in ~/bittube
    #
    # also can build with ASAN (sanitizers), for example
    # cmake -DSANITIZE_ADDRESS=On ..
    
    # compile
    make

    To run it:

    ./bittube-blockchain-explorer
    

    By default it will look for the blockchain in its default location, i.e., ~/.bittube/lmdb.
    You can use the -b option if it is in a different location.

    For example:

    ./bittube-blockchain-explorer -b /home/mwo/non-default-bittube-location/lmdb/

    Example output:

    [mwo@arch bittube-blockchain-explorer]$ ./bittube-blockchain-explorer
    2016-May-28 10:04:49.160280 Blockchain initialized. last block: 1056761, d0.h0.m12.s47 time ago, current difficulty: 1517857750
    (2016-05-28 02:04:49) [INFO    ] Crow/0.1 server is running, local port 8081

    Go to your browser: http://127.0.0.1:8081

    The explorer’s command line options

    bittube-blockchain-explorer, BitTube Blockchain Explorer:
      -h [ --help ] [=arg(=1)] (=0)         produce help message
      -t [ --testnet ] [=arg(=1)] (=0)      use testnet blockchain
      -s [ --stagenet ] [=arg(=1)] (=0)     use stagenet blockchain
      --enable-pusher [=arg(=1)] (=0)       enable signed transaction pusher
      --enable-randomx [=arg(=1)] (=0)      enable generation of randomx code
      --enable-mixin-details [=arg(=1)] (=0)
                                            enable mixin details for key images,
                                            e.g., timescale, mixin of mixins, in tx
                                            context
      --enable-key-image-checker [=arg(=1)] (=0)
                                            enable key images file checker
      --enable-output-key-checker [=arg(=1)] (=0)
                                            enable outputs key file checker
      --enable-json-api [=arg(=1)] (=0)     enable JSON REST api
      --enable-as-hex [=arg(=1)] (=0)       enable links to provide hex
                                            representations of a tx and a block
      --enable-autorefresh-option [=arg(=1)] (=0)
                                            enable users to have the index page on
                                            autorefresh
      --enable-emission-monitor [=arg(=1)] (=0)
                                            enable BitTube total emission monitoring
                                            thread
      -p [ --port ] arg (=8081)             default explorer port
      -x [ --bindaddr ] arg (=0.0.0.0)      default bind address for the explorer
      --testnet-url arg                     you can specify testnet url, if you run
                                            it on mainnet or stagenet. link will
                                            show on front page to testnet explorer
      --stagenet-url arg                    you can specify stagenet url, if you
                                            run it on mainnet or testnet. link will
                                            show on front page to stagenet explorer
      --mainnet-url arg                     you can specify mainnet url, if you run
                                            it on testnet or stagenet. link will
                                            show on front page to mainnet explorer
      --no-blocks-on-index arg (=10)        number of last blocks to be shown on
                                            index page
      --mempool-info-timeout arg (=5000)    maximum time, in milliseconds, to wait
                                            for mempool data for the front page
      --mempool-refresh-time arg (=5)       time, in seconds, for each refresh of
                                            mempool state
      -c [ --concurrency ] arg (=0)         number of threads handling http
                                            queries. Default is 0, which means it is
                                            based on the cpu
      -b [ --bc-path ] arg                  path to lmdb folder of the blockchain,
                                            e.g., ~/.bittube/lmdb
      --ssl-crt-file arg                    path to crt file for ssl (https)
                                            functionality
      --ssl-key-file arg                    path to key file for ssl (https)
                                            functionality
      -d [ --deamon-url ] arg (=http://127.0.0.1:24182)
                                            Monero daemon url
      --daemon-login arg                    Specify username[:password] for daemon 
                                            RPC client
    

    Example usage, defined as bash aliases:

    # for mainnet explorer
    alias bittube-blockchain-explorer-mainnet='~/bittube-blockchain-explorer/build/bittube-blockchain-explorer    --port 8081 --testnet-url "http://139.162.32.245:8082" --enable-pusher --enable-emission-monitor'
    
    # for testnet explorer
    alias bittube-blockchain-explorer-testnet='~/bittube-blockchain-explorer/build/bittube-blockchain-explorer -t --port 8082 --mainnet-url "http://139.162.32.245:8081" --enable-pusher --enable-emission-monitor'

    Enable BitTube emission

    Obtaining the current BitTube emission amount is not straightforward, so by default it is
    disabled. To enable it, use the --enable-emission-monitor flag, e.g.,

    bittube-blockchain-explorer --enable-emission-monitor

    This flag enables the emission monitoring thread. When started, the thread
    will initially scan the entire blockchain and calculate the cumulative emission based on each block.
    Since it is a separate thread, the explorer will work as usual during this time.
    Every 10000 blocks, the thread saves the current emission in a file, by default
    ~/.bittube/lmdb/emission_amount.txt. For the testnet or stagenet networks,
    it is ~/.bittube/testnet/lmdb/emission_amount.txt or ~/.bittube/stagenet/lmdb/emission_amount.txt. This file is used so that the
    entire blockchain does not need to be rescanned whenever the explorer is restarted. When the
    explorer restarts, the thread first checks whether ~/.bittube/lmdb/emission_amount.txt
    is present, reads its values, and continues from there if possible. Consequently, only the initial
    run of the thread is time consuming. Once the thread has scanned the entire blockchain, it updates
    the emission amount as new blocks arrive. Since the explorer writes this file, only one
    instance of it can run for each of mainnet, testnet, and stagenet. For example, you cannot have
    two explorers for mainnet running at the same time, as they would try to read and write the same
    file at the same time, leading to unexpected results. Of course, having one instance for mainnet
    and one for testnet is fine, as they write to different files.

    When the emission monitor is enabled, information about the current emission of coinbase and fees is
    displayed on the front page, e.g.:

    BitTube emission (fees) is 14485540.430 (52545.373) as of 1313448 block
    

    The values given can be checked using the BitTube daemon’s print_coinbase_tx_sum command.
    For example, for the output above: print_coinbase_tx_sum 0 1313449.

    To disable the monitor, simply restart the explorer without --enable-emission-monitor flag.

    Enable SSL (https)

    By default, the explorer does not use SSL, but this functionality is available.

    As an example, you can generate your own ssl certificates as follows:

    cd /tmp # example folder
    openssl genrsa -out server.key 1024
    openssl req -new -key server.key -out server.csr
    openssl x509 -req -days 3650 -in server.csr -signkey server.key -out server.crt
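The openssl req step above prompts interactively for the certificate fields. For unattended generation, the subject can be supplied on the command line (a sketch; CN=localhost is just an example subject, and a 2048-bit key is used here since 1024-bit RSA is considered weak):

```shell
# generate a self-signed certificate without interactive prompts
cd /tmp   # example folder
openssl genrsa -out server.key 2048
openssl req -new -key server.key -out server.csr -subj "/CN=localhost"
openssl x509 -req -days 3650 -in server.csr -signkey server.key -out server.crt

# sanity check: print the subject of the generated certificate
openssl x509 -noout -subject -in server.crt
```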

    Having the crt and key files, run bittube-blockchain-explorer in the following way:

    ./bittube-blockchain-explorer --ssl-crt-file=/tmp/server.crt --ssl-key-file=/tmp/server.key

    Note: Because we generated our own certificate, modern browsers will complain
    about it, as they cannot verify the signature against any third party. So for
    any practical use, properly issued SSL certificates are needed.

    JSON API

    The explorer has a JSON API. For the API, it uses conventions defined by JSend.
    By default the API is disabled. To enable it, use the --enable-json-api flag, e.g.,

    ./bittube-blockchain-explorer --enable-json-api
    

    api/transaction/<tx_hash>

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/transaction/6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d"

    Partial results shown:

    {
      "data": {
        "block_height": 1268252,
        "coinbase": false,
        "confirmations": 1,
        "current_height": 1268253,
        "extra": "01be23e277aed6b5f41f66b05244bf994c13108347366ec678ae16657f0fc3a22b",
        "inputs": [
          {
            "amount": 0,
            "key_image": "67838fd0ffd79f13e735830d3ec60412aed59e53e1f997feb6f73d088b949611",
            "mixins": [
              {
                "block_no": 1238623,
                "public_key": "0a5b853c55303c10e1326acfb085b9e246e088b1ccac7e37f7a810d46a28a914"
              },
              {
                "block_no": 1246942,
                "public_key": "527cf86f5abbfb006c970f7c6eb40493786d4751306f8985c6a43f98a88c0dff"
              }
            ]
          }
        ],
        "mixin": 9,
        "outputs": [
          {
            "amount": 0,
            "public_key": "525779873776e4a42f517fd79b72e7c31c3ba03e730fc32287f6414fb702c1d7"
          },
          {
            "amount": 0,
            "public_key": "e25f00fceb77af841d780b68647618812695b4ca6ebe338faba6e077f758ac30"
          }
        ],
        "payment_id": "",
        "payment_id8": "",
        "rct_type": 1,
        "timestamp": 1489753456,
        "timestamp_utc": "2017-03-17 12:24:16",
        "tx_fee": 12517785574,
        "tx_hash": "6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d",
        "tx_size": 13323,
        "tx_version": 2,
        "xmr_inputs": 0,
        "xmr_outputs": 0
      },
      "status": "success"
    }
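Individual fields can be pulled out of such JSON responses with standard command-line tools. A minimal sketch, operating on a fragment of the sample response above saved to tx.json (a proper JSON tool such as jq is preferable when available):

```shell
# sample fragment of an api/transaction response (values from the example above)
cat > tx.json <<'EOF'
{
  "data": {
    "block_height": 1268252,
    "tx_fee": 12517785574
  },
  "status": "success"
}
EOF

# extract simple numeric fields with grep
grep -o '"tx_fee": [0-9]*' tx.json          # "tx_fee": 12517785574
grep -o '"block_height": [0-9]*' tx.json    # "block_height": 1268252
```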

    api/transactions

    Transactions in last 25 blocks

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/transactions"

    Partial results shown:

    {
      "data": {
        "blocks": [
          {
            "age": "33:16:49:53",
            "height": 1268252,
            "size": 105390000000000000,
            "timestamp": 1489753456,
            "timestamp_utc": "2017-03-17 12:24:16",
            "txs": [
              {
                "coinbase": true,
                "mixin": 0,
                "outputs": 8491554678365,
                "rct_type": 0,
                "tx_fee": 0,
                "tx_hash": "7c4286f64544568265bb5418df84ae69afaa3567749210e46f8340c247f4803f",
                "tx_size": 151000000000000,
                "tx_version": 2
              },
              {
                "coinbase": false,
                "mixin": 5,
                "outputs": 0,
                "rct_type": 2,
                "tx_fee": 17882516700,
                "tx_hash": "2bfbccb918ee5f050808dd040ce03943b7315b81788e9cdee59cf86b557ba48c",
                "tx_size": 19586000000000000,
                "tx_version": 2
              }
            ]
          }
        ],
        "limit": 25,
        "page": 0
      },
      "status": "success"
    }

    api/transactions?page=<page_no>&limit=<tx_per_page>

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/transactions?page=2&limit=10"

    Result analogous to the one above.

    api/block/<block_number|block_hash>

    curl  -w "\n" -X GET "http://139.162.32.245:8081/api/block/1293257"

    Partial results shown:

    {
      "data": {
        "block_height": 1293257,
        "block_reward": 0,
        "current_height": 1293264,
        "hash": "9ef6bb8f9b8bd253fc6390e5c2cdc45c8ee99fad16447437108bf301fe6bd6e1",
        "size": 141244,
        "timestamp": 1492761974,
        "timestamp_utc": "2017-04-21 08:06:14",
        "txs": [
          {
            "coinbase": true,
            "extra": "018ae9560eb85d5ebd22d3beaed55c21d469eab430c5e3cac61b3fe2f5ad156770020800000001a9030800",
            "mixin": 0,
            "payment_id": "",
            "payment_id8": "",
            "rct_type": 0,
            "tx_fee": 0,
            "tx_hash": "3ff71b65bec34c9261e01a856e6a03594cf0472acf6b77db3f17ebd18eaa30bf",
            "tx_size": 95,
            "tx_version": 2,
            "xmr_inputs": 0,
            "xmr_outputs": 8025365394426
          }
        ]
      },
      "status": "success"
    }

    api/mempool

    Return all txs in the mempool.

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/mempool"

    Partial results shown:

    {
      "data": {
        "limit": 100000000,
        "page": 0,
        "total_page_no": 0,
        "txs": [
          {
            "coinbase": false,
            "extra": "022100325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2012eda81bf552c53c2168f4130dbe0265c3a7898f3a7eee7c1fed955a778167b5d",
            "mixin": 3,
            "payment_id": "325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2",
            "payment_id8": "",
            "rct_type": 2,
            "timestamp": 1494470894,
            "timestamp_utc": "2017-05-11 02:48:14",
            "tx_fee": 15894840000,
            "tx_hash": "9f3374f8ac67febaab153eab297937a3d0d2c706601e496bf5028146da0c9aef",
            "tx_size": 13291,
            "tx_version": 2,
            "xmr_inputs": 0,
            "xmr_outputs": 0
          }
        ],
        "txs_no": 7
      },
      "status": "success"
    }

    The limit of 100000000 above is just the default value, used to ensure that all mempool txs
    are fetched when no specific limit is given.

    api/mempool?limit=<no_of_top_txs>

    Return a given number of the newest mempool txs, e.g., only 10.

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/mempool?limit=10"

    Result analogous to the one above.

    api/search/<block_number|tx_hash|block_hash>

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/search/1293669"

    Partial results shown:

    {
      "data": {
        "block_height": 1293669,
        "current_height": 1293670,
        "hash": "5d55b8fabf85b0b4c959d66ad509eb92ddfe5c2b0e84e1760abcb090195c1913",
        "size": 118026,
        "timestamp": 1492815321,
        "timestamp_utc": "2017-04-21 22:55:21",
        "title": "block",
        "txs": [
          {
            "coinbase": true,
            "extra": "01cb7fda09033a5fa06dc601b9295ef3790397cf3c645e958e34cf7ab699d2f5230208000000027f030200",
            "mixin": 0,
            "payment_id": "",
            "payment_id8": "",
            "rct_type": 0,
            "tx_fee": 0,
            "tx_hash": "479ba432f5c88736b438dd4446a11a13046a752d469f7828151f5c5b86be4e9a",
            "tx_size": 95,
            "tx_version": 2,
            "xmr_inputs": 0,
            "xmr_outputs": 7992697599717
          }
        ]
      },
      "status": "success"
    }

    api/outputs?txhash=<tx_hash>&address=&viewkey=&txprove=<0|1>

    For txprove=0 we check which outputs belong to the given address and corresponding viewkey.
    For txprove=1 we prove to the recipient that we sent them funds.
    For this, we use the recipient’s address and our tx private key as the viewkey value,
    i.e., viewkey=<tx_private_key>

    Checking outputs:

    # we use here official BitTube project's donation address as an example
    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/outputs?txhash=17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831&address=44AFFq5kSiGBoZ4NMDwYtN18obc8AemS33DBLWs3H7otXft3XjrpDtQGv7SqSsaBYBb98uNbr2VBBEt7f2wfn3RVGQBEP3A&viewkey=f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501&txprove=0"

    {
      "data": {
        "address": "42f18fc61586554095b0799b5c4b6f00cdeb26a93b20540d366932c6001617b75db35109fbba7d5f275fef4b9c49e0cc1c84b219ec6ff652fda54f89f7f63c88",
        "outputs": [
          {
            "amount": 34980000000000,
            "match": true,
            "output_idx": 0,
            "output_pubkey": "35d7200229e725c2bce0da3a2f20ef0720d242ecf88bfcb71eff2025c2501fdb"
          },
          {
            "amount": 0,
            "match": false,
            "output_idx": 1,
            "output_pubkey": "44efccab9f9b42e83c12da7988785d6c4eb3ec6e7aa2ae1234e2f0f7cb9ed6dd"
          }
        ],
        "tx_hash": "17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831",
        "tx_prove": false,
        "viewkey": "f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501"
      },
      "status": "success"
    }

    Proving transfer:

    We use the recipient’s address (i.e., not our own address from which we sent the funds).
    For the viewkey, we use the tx_private_key (although the GET variable is still called viewkey) that we obtained when sending this tx.

    # this is for testnet transaction
    curl  -w "\n" -X GET "http://127.0.0.1:8082/api/outputs?txhash=94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3&address=9wUf8UcPUtb2huK7RphBw5PFCyKosKxqtGxbcKBDnzTCPrdNfJjLjtuht87zhTgsffCB21qmjxjj18Pw7cBnRctcKHrUB7N&viewkey=e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b&txprove=1"

    {
      "data": {
        "address": "71bef5945b70bc0a31dbbe6cd0bd5884fe694bbfd18fff5f68f709438554fb88a51b1291e378e2f46a0155108782c242cc1be78af229242c36d4f4d1c4f72da2",
        "outputs": [
          {
            "amount": 1000000000000,
            "match": true,
            "output_idx": 0,
            "output_pubkey": "c1bf4dd020b5f0ab70bd672d2f9e800ea7b8ab108b080825c1d6cfc0b7f7ee00"
          },
          {
            "amount": 0,
            "match": false,
            "output_idx": 1,
            "output_pubkey": "8c61fae6ada2a103565dfdd307c7145b2479ddb1dab1eaadfa6c34db65d189d5"
          }
        ],
        "tx_hash": "94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3",
        "tx_prove": true,
        "viewkey": "e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b"
      },
      "status": "success"
    }

    Result analogous to the one above.

    api/networkinfo

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/networkinfo"

    {
      "data": {
        "alt_blocks_count": 0,
        "block_size_limit": 600000,
        "cumulative_difficulty": 2091549555696348,
        "difficulty": 7941560081,
        "fee_per_kb": 303970000,
        "grey_peerlist_size": 4991,
        "hash_rate": 66179667,
        "height": 1310423,
        "incoming_connections_count": 0,
        "outgoing_connections_count": 5,
        "start_time": 1494822692,
        "status": "OK",
        "target": 120,
        "target_height": 0,
        "testnet": false,
        "top_block_hash": "76f9e85d62415312758bc09e0b9b48fd2b005231ad1eee435a8081e551203f82",
        "tx_count": 1219048,
        "tx_pool_size": 2,
        "white_peerlist_size": 1000
      },
      "status": "success"
    }

    api/outputsblocks

    Search for our outputs in the last few blocks (up to 5), using the provided address and viewkey.

    # testnet address
    curl  -w "\n" -X GET http://127.0.0.1:8081/api/outputsblocks?address=9sDyNU82ih1gdhDgrqHbEcfSDFASjFgxL9B9v5f1AytFUrYsVEj7bD9Pyx5Sw2qLk8HgGdFM8qj5DNecqGhm24Ce6QwEGDi&viewkey=807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a&limit=5&mempool=1

    Example result:

    {
      "data": {
        "address": "0182d5be0f708cecf2b6f9889738bde5c930fad846d5b530e021afd1ae7e24a687ad50af3a5d38896655669079ad0163b4a369f6c852cc816dace5fc7792b72f",
        "height": 960526,
        "limit": "5",
        "mempool": true,
        "outputs": [
          {
            "amount": 33000000000000,
            "block_no": 0,
            "in_mempool": true,
            "output_idx": 1,
            "output_pubkey": "2417b24fc99b2cbd9459278b532b37f15eab6b09bbfc44f9d17e15cd25d5b44f",
            "payment_id": "",
            "tx_hash": "9233708004c51d15f44e86ac1a3b99582ed2bede4aaac6e2dd71424a9147b06f"
          },
          {
            "amount": 2000000000000,
            "block_no": 960525,
            "in_mempool": false,
            "output_idx": 0,
            "output_pubkey": "9984101f5471dda461f091962f1f970b122d4469077aed6b978a910dc3ed4576",
            "payment_id": "0000000000000055",
            "tx_hash": "37825d0feb2e96cd10fa9ec0b990ac2e97d2648c0f23e4f7d68d2298996acefd"
          },
          {
            "amount": 96947454120000,
            "block_no": 960525,
            "in_mempool": false,
            "output_idx": 1,
            "output_pubkey": "e4bded8e2a9ec4d41682a34d0a37596ec62742b28e74b897fcc00a47fcaa8629",
            "payment_id": "0000000000000000000000000000000000000000000000000000000000001234",
            "tx_hash": "4fad5f2bdb6dbd7efc2ce7efa3dd20edbd2a91640ce35e54c6887f0ee5a1a679"
          }
        ],
        "viewkey": "807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a"
      },
      "status": "success"
    }

    api/emission

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/emission"

    {
      "data": {
        "blk_no": 1313969,
        "coinbase": 14489473877253413000,
        "fee": 52601974988641130
      },
      "status": "success"
    }

    Emission data is only available when the emission monitoring thread is enabled.

    api/version

    curl  -w "\n" -X GET "http://127.0.0.1:8081/api/version"

    {
      "data": {
        "api": 65536,
        "blockchain_height": 1357031,
        "git_branch_name": "update_to_current_bittube",
        "last_git_commit_date": "2017-07-25",
        "last_git_commit_hash": "a549f25",
        "bittube_version_full": "0.10.3.1-ab594cfe"
      },
      "status": "success"
    }

    The API number is stored as a uint32_t. In this case 65536 represents
    major version 1 and minor version 0.
    In JavaScript, these numbers can be obtained as follows:

    var api_major = response.data.api >> 16;
    var api_minor = response.data.api & 0xffff;
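The same decoding works with shell arithmetic, e.g. for the value 65536 returned above:

```shell
# decode the packed API version: major in the high 16 bits, minor in the low 16
api=65536
api_major=$(( api >> 16 ))
api_minor=$(( api & 0xffff ))
echo "$api_major.$api_minor"   # prints 1.0
```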

    api/rawblock/<block_number|block_hash>

    Return raw json block data, as represented in BitTube.

    curl  -w "\n" -X GET "http://139.162.32.245:8081/api/rawblock/1293257"

    Example result not shown.

    api/rawtransaction/<tx_hash>

    Return raw json tx data, as represented in BitTube.

    curl  -w "\n" -X GET "http://139.162.32.245:8081/api/rawtransaction/6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d"

    Example result not shown.

    How can you help?

    Constructive criticism, code, and website edits are always welcome. They can be made through GitHub.

    Visit original content creator repository
    https://github.com/ipbc-dev/bittube-blockchain-explorer