initial commit

Nathan Wang 2021-10-19 17:53:06 -05:00
commit cfafe34339
124 changed files with 11607 additions and 0 deletions

68
.github/CONTRIBUTING.md vendored Normal file

@ -0,0 +1,68 @@
# Contributing to the FTC SDK
The following is a set of guidelines for contributing to the FIRST FTC SDK. The FTC Technology Team welcomes suggestions for improvements to core software, ideas for new features, requests for built-in support of new sensors, and well-written bug reports.
## How can I contribute?
### Pull requests
__STOP!__ If you are new to git, do not understand the mechanics of forks, branches, and pulls, or if what you just read is confusing, __do not__ push this button. Most likely it won't do what you think it will.
![Pull Button](../doc/media/PullRequest.PNG)
If you are looking at this button then you've pushed some changes to your team's fork of ftctechnh/ftc_app. Congratulations! You are almost certainly finished.
The vast majority of pull requests seen on the ftctechnh/ftc_app repository are not intended to be merged into the official SDK. Team software is just that: your team's. It's specific to the tasks you are trying to accomplish, the testing you are doing, and the goals your team has. You don't want that pushed into the official SDK.
If what you've read so far makes little sense, there are some very good git learning resources online.
[Git Book](https://git-scm.com/book/en/v2)
[Interactive Git Tutorial](https://try.github.io)
#### Guidelines for experienced Git users
If you are absolutely certain that you want to push the big green button above, read on. Otherwise, back _slowly away from the keyboard_.
The real intent for advanced users is often to issue a pull request from a [branch](https://www.atlassian.com/git/tutorials/using-branches/git-branch) on a local fork back to master on either the same local fork or a child of the team fork, not on the parent ftctechnh/ftc_app. See [Creating a Pull Request](https://help.github.com/articles/creating-a-pull-request-from-a-fork/).
If that is indeed the intent, you can merge your [topic branch](https://git-scm.com/book/en/v2/Git-Branching-Branching-Workflows#Topic-Branches) into master locally by hand before pushing it up to GitHub. If instead you want a pull request between branches on the same repository, because, say, you want team members to review your software before it is merged into master, you can select the base fork from the dropdown on the "Open a pull request" page and choose your team repo instead of ftctechnh's.
Alternatively, if you have a team repository forked from ftctechnh/ftc_app, and team members individually fork from your team repository, then pull requests from the individual team members' forks will have the main team repository automatically selected as the base fork for the pull, and you won't inadvertently request to pull your team software into ftctechnh's repository.
The latter would be the "best" way to manage software among a large team. But as with all things git there are many options.
Pull requests that do not fall into the category above are evaluated by the FTC Technology Team on a case-by-case basis. Please note however that the deployment model of the SDK does not support direct pulls into ftctechnh/ftc_app.
### Report bugs
This section guides you through filing a bug report. The better the report, the more likely it is to be root-caused and fixed. Please refrain from making feature requests or suggesting software enhancements when opening new issues; see Suggesting Enhancements below.
#### Before submitting a bug report
- Check the [forums](http://ftcforum.usfirst.org/forum.php) to see if someone else has run into the problem and whether there is an official solution that doesn't require a new SDK.
- Perform a search of current [issues](https://github.com/ftctechnh/ftc_app/issues) to see if the problem has already been reported. If so, add a comment to the existing issue instead of creating a new one.
#### How Do I Submit A (Good) Bug Report?
Bugs are tracked as GitHub issues. Create an issue on ftctechnh/ftc_app and provide the following information.
Explain the problem and include additional details to help maintainers reproduce the problem:
- Use a clear and descriptive title for the issue to identify the problem.
- Describe the exact steps which reproduce the problem in as many details as possible.
- Provide specific examples to demonstrate the steps.
- Describe the behavior you observed after following the steps and point out what exactly is the problem with that behavior. Explain which behavior you expected to see instead and why. If applicable, include screenshots which show you following the described steps and clearly demonstrate the problem.
- If you're reporting that the RobotController crashed, include the logfile with a stack trace of the crash. [Example of good bug report with stack trace](https://github.com/ftctechnh/ftc_app/issues/224)
- If the problem wasn't triggered by a specific action, describe what you were doing before the problem happened and share more information using the guidelines below.
### Suggesting Enhancements
FIRST volunteers are awesome. You all have great ideas and we want to hear them.
Enhancements should be broadly applicable to a large majority of teams, should not force teams to change their workflow, and should provide real value to the mission of FIRST as it relates to engaging youth in engineering activities.
The best way to get momentum behind new features is to post a description of your idea in the forums. Build community support for it. The FTC Technology Team monitors the forums. We'll hear you, and if there's a large enough call for the feature, it's very likely to be put on the list for a future release.

1
.github/PULL_REQUEST_TEMPLATE.md vendored Normal file

@ -0,0 +1 @@
Before issuing a pull request, please see the contributing page.

87
.gitignore vendored Normal file

@ -0,0 +1,87 @@
# Built application files
*.apk
*.aar
*.ap_
*.aab
!libs/*.aar
# Files for the ART/Dalvik VM
*.dex
# Java class files
*.class
# Generated files
bin/
gen/
out/
# Uncomment the following line if you need to ignore the release build type files in your app
# release/
# Gradle files
.gradle/
build/
# Local configuration file (sdk path, etc)
local.properties
# Proguard folder generated by Eclipse
proguard/
# Log Files
*.log
# Android Studio Navigation editor temp files
.navigation/
# Android Studio captures folder
captures/
# IntelliJ
*.iml
.idea/workspace.xml
.idea/tasks.xml
.idea/gradle.xml
.idea/assetWizardSettings.xml
.idea/dictionaries
.idea/libraries
# Android Studio 3 in .gitignore file.
.idea/caches
.idea/modules.xml
# Comment next line if keeping position of elements in Navigation Editor is relevant for you
.idea/navEditor.xml
# Keystore files
# Uncomment the following lines if you do not want to check your keystore files in.
#*.jks
#*.keystore
# External native build folder generated in Android Studio 2.2 and later
.externalNativeBuild
.cxx/
# Google Services (e.g. APIs or Firebase)
# google-services.json
# Freeline
freeline.py
freeline/
freeline_project_description.json
# fastlane
fastlane/report.xml
fastlane/Preview.html
fastlane/screenshots
fastlane/test_output
fastlane/readme.md
# Version control
vcs.xml
# lint
lint/intermediates/
lint/generated/
lint/outputs/
lint/tmp/
# lint/reports/

3
.idea/.gitignore vendored Normal file

@ -0,0 +1,3 @@
# Default ignored files
/shelf/
/workspace.xml

6
.idea/compiler.xml Normal file

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="CompilerConfiguration">
<bytecodeTargetLevel target="11" />
</component>
</project>

30
.idea/jarRepositories.xml Normal file

@ -0,0 +1,30 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="RemoteRepositoriesConfiguration">
<remote-repository>
<option name="id" value="central" />
<option name="name" value="Maven Central repository" />
<option name="url" value="https://repo1.maven.org/maven2" />
</remote-repository>
<remote-repository>
<option name="id" value="jboss.community" />
<option name="name" value="JBoss Community repository" />
<option name="url" value="https://repository.jboss.org/nexus/content/repositories/public/" />
</remote-repository>
<remote-repository>
<option name="id" value="MavenRepo" />
<option name="name" value="MavenRepo" />
<option name="url" value="https://repo.maven.apache.org/maven2/" />
</remote-repository>
<remote-repository>
<option name="id" value="BintrayJCenter" />
<option name="name" value="BintrayJCenter" />
<option name="url" value="https://jcenter.bintray.com/" />
</remote-repository>
<remote-repository>
<option name="id" value="Google" />
<option name="name" value="Google" />
<option name="url" value="https://dl.google.com/dl/android/maven2/" />
</remote-repository>
</component>
</project>

9
.idea/misc.xml Normal file

@ -0,0 +1,9 @@
<?xml version="1.0" encoding="UTF-8"?>
<project version="4">
<component name="ProjectRootManager" version="2" languageLevel="JDK_11" default="true" project-jdk-name="Android Studio default JDK" project-jdk-type="JavaSDK">
<output url="file://$PROJECT_DIR$/build/classes" />
</component>
<component name="ProjectType">
<option name="id" value="Android" />
</component>
</project>

Binary file not shown.


@ -0,0 +1,25 @@
import java.text.SimpleDateFormat
//
// build.gradle in FtcRobotController
//
apply plugin: 'com.android.library'
android {
defaultConfig {
minSdkVersion 23
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
buildConfigField "String", "APP_BUILD_TIME", '"' + (new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SSSZ", Locale.ROOT).format(new Date())) + '"'
}
compileSdkVersion 29
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_7
targetCompatibility JavaVersion.VERSION_1_7
}
}
apply from: '../build.dependencies.gradle'
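
The buildConfigField above bakes the build timestamp into the module's generated BuildConfig class at compile time. A minimal sketch of reading it back at runtime, assuming the generated class lives in the package declared by this module's manifest (com.qualcomm.ftcrobotcontroller) and that this hypothetical helper is added to the module:

    import android.util.Log;
    import com.qualcomm.ftcrobotcontroller.BuildConfig;

    public class BuildInfoLogger {
        // Logs the APP_BUILD_TIME constant generated from the buildConfigField above.
        public static void logBuildTime() {
            Log.i("BuildInfo", "APK built at " + BuildConfig.APP_BUILD_TIME);
        }
    }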


@ -0,0 +1,81 @@
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
package="com.qualcomm.ftcrobotcontroller"
android:versionCode="42"
android:versionName="7.0">
<uses-permission android:name="android.permission.RECEIVE_BOOT_COMPLETED" />
<application
android:allowBackup="true"
android:largeHeap="true"
android:extractNativeLibs="true"
android:icon="@drawable/ic_launcher"
android:label="@string/app_name"
android:theme="@style/AppThemeRedRC"
android:usesCleartextTraffic="true">
<!-- Indicates to the ControlHubUpdater what the latest version of the Control Hub is that this app supports -->
<meta-data
android:name="org.firstinspires.latestSupportedControlHubVersion"
android:value="1" />
<!-- The main robot controller activity -->
<activity android:name="org.firstinspires.ftc.robotcontroller.internal.PermissionValidatorWrapper"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<category android:name="android.intent.category.LAUNCHER" />
<action android:name="android.intent.action.MAIN" />
</intent-filter>
</activity>
<activity
android:name="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:screenOrientation="fullUser"
android:configChanges="orientation|screenSize"
android:label="@string/app_name"
android:launchMode="singleTask" >
<intent-filter>
<action android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED" />
</intent-filter>
<meta-data
android:name="android.hardware.usb.action.USB_DEVICE_ATTACHED"
android:resource="@xml/device_filter" />
<!--org.firstinspires.main.entry indicates that this app is compatible with the Dragonboard Control Hub-->
<meta-data
android:name="org.firstinspires.main.entry"
android:value="true" />
</activity>
<!-- The robot controller service in which most of the robot functionality is managed -->
<service
android:name="com.qualcomm.ftccommon.FtcRobotControllerService"
android:enabled="true" />
<!-- Assistant that autostarts the robot controller on android boot (if it's supposed to) -->
<receiver
android:enabled="true"
android:exported="true"
android:name="org.firstinspires.ftc.ftccommon.internal.RunOnBoot"
android:permission="android.permission.RECEIVE_BOOT_COMPLETED">
<intent-filter>
<category android:name="android.intent.category.DEFAULT" />
<action android:name="android.intent.action.BOOT_COMPLETED" />
<action android:name="android.intent.action.QUICKBOOT_POWERON" />
</intent-filter>
</receiver>
</application>
</manifest>

Binary file not shown.


@ -0,0 +1,9 @@
<?xml version="1.0" encoding="UTF-8"?>
<QCARConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="qcar_config.xsd">
<Tracking>
<ImageTarget name="Wheels" size="254.000000 184.154922" />
<ImageTarget name="Tools" size="254.000000 184.154922" />
<ImageTarget name="Legos" size="254.000000 184.154922" />
<ImageTarget name="Gears" size="254.000000 184.154922" />
</Tracking>
</QCARConfig>

Binary file not shown.


@ -0,0 +1,6 @@
<?xml version='1.0' encoding='UTF-8'?>
<QCARConfig>
<Tracking>
<VuMark name="RelicRecovery" size="304.80000376701355 223.630235354" />
</Tracking>
</QCARConfig>

Binary file not shown.


@ -0,0 +1,7 @@
<?xml version="1.0"?>
<QCARConfig xsi:noNamespaceSchemaLocation="qcar_config.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<Tracking>
<ImageTarget name="stones" size="247 173"/>
<ImageTarget name="chips" size="247 173"/>
</Tracking>
</QCARConfig>

Binary file not shown.

Binary file not shown.


@ -0,0 +1,10 @@
<?xml version="1.0" encoding="UTF-8"?>
<QCARConfig xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="qcar_config.xsd">
<Tracking>
<ImageTarget name="BlueTowerGoal" size="257.299988 171.533325" />
<ImageTarget name="RedTowerGoal" size="257.299988 171.533325" />
<ImageTarget name="RedAlliance" size="242.600006 171.430405" />
<ImageTarget name="BlueAlliance" size="252.500000 171.466522" />
<ImageTarget name="FrontWall" size="177.800003 177.800003" />
</Tracking>
</QCARConfig>


@ -0,0 +1,139 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
* This file contains an example of an iterative (Non-Linear) "OpMode".
* An OpMode is a 'program' that runs in either the autonomous or the teleop period of an FTC match.
* The names of OpModes appear on the menu of the FTC Driver Station.
* When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two wheeled robot
* It includes all the skeletal structure that all iterative OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Iterative OpMode", group="Iterative Opmode")
@Disabled
public class BasicOpMode_Iterative extends OpMode
{
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
telemetry.addData("Status", "Initialized");
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// Most robots need the motor on one side to be reversed to drive forward
// Reverse the motor that runs backwards when connected directly to the battery
leftDrive.setDirection(DcMotor.Direction.FORWARD);
rightDrive.setDirection(DcMotor.Direction.REVERSE);
// Tell the driver that initialization is complete.
telemetry.addData("Status", "Initialized");
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
runtime.reset();
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}


@ -0,0 +1,114 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
* This file contains a minimal example of a Linear "OpMode". An OpMode is a 'program' that runs in either
* the autonomous or the teleop period of an FTC match. The names of OpModes appear on the menu
* of the FTC Driver Station. When a selection is made from the menu, the corresponding OpMode
* class is instantiated on the Robot Controller and executed.
*
* This particular OpMode just executes a basic Tank Drive Teleop for a two wheeled robot
* It includes all the skeletal structure that all linear OpModes contain.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Basic: Linear OpMode", group="Linear Opmode")
@Disabled
public class BasicOpMode_Linear extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// Most robots need the motor on one side to be reversed to drive forward
// Reverse the motor that runs backwards when connected directly to the battery
leftDrive.setDirection(DcMotor.Direction.FORWARD);
rightDrive.setDirection(DcMotor.Direction.REVERSE);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}


@ -0,0 +1,121 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.CompassSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of calibrating a MR Compass
* It uses the common Pushbot hardware class to define the drive on the robot.
* The code is structured as a LinearOpMode
*
* This code assumes there is a compass configured with the name "compass"
*
* This code will put the compass into calibration mode, wait three seconds and then attempt
* to rotate two full turns clockwise. This will allow the compass to do a magnetic calibration.
*
* Once complete, the program will put the compass back into measurement mode and check to see if the
* calibration was successful.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Concept: Compass Calibration", group="Concept")
@Disabled
public class ConceptCompassCalibration extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
private ElapsedTime runtime = new ElapsedTime();
CompassSensor compass;
final static double MOTOR_POWER = 0.2; // scale from 0 to 1
static final long HOLD_TIME_MS = 3000;
static final double CAL_TIME_SEC = 20;
@Override
public void runOpMode() {
/* Initialize the drive system variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// get a reference to our Compass Sensor object.
compass = hardwareMap.get(CompassSensor.class, "compass");
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to cal"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Set the compass to calibration mode
compass.setMode(CompassSensor.CompassMode.CALIBRATION_MODE);
telemetry.addData("Compass", "Compass in calibration mode");
telemetry.update();
sleep(HOLD_TIME_MS); // Just do a sleep while we switch modes
// Start the robot rotating clockwise
telemetry.addData("Compass", "Calibration mode. Turning the robot...");
telemetry.update();
robot.leftDrive.setPower(MOTOR_POWER);
robot.rightDrive.setPower(-MOTOR_POWER);
// run until time expires OR the driver presses STOP;
runtime.reset();
while (opModeIsActive() && (runtime.time() < CAL_TIME_SEC)) {
idle();
}
// Stop all motors and turn off calibration
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
compass.setMode(CompassSensor.CompassMode.MEASUREMENT_MODE);
telemetry.addData("Compass", "Returning to measurement mode");
telemetry.update();
sleep(HOLD_TIME_MS); // Just do a sleep while we switch modes
// Report whether the Calibration was successful or not.
if (compass.calibrationFailed())
telemetry.addData("Compass", "Calibrate Failed. Try Again!");
else
telemetry.addData("Compass", "Calibrate Passed.");
telemetry.update();
}
}


@ -0,0 +1,103 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DeviceInterfaceModule;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This OpMode illustrates using the Device Interface Module as a signalling device.
* The code is structured as a LinearOpMode
*
* This code assumes a DIM named "dim".
*
* There are many examples where the robot might like to signal the driver, without requiring them
* to look at the driver station. This might be something like a "ball in hopper" condition or a
* "ready to shoot" condition.
*
* The DIM has two user settable indicator LEDs (one red one blue). These can be controlled
* directly from your program.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: DIM As Indicator", group = "Concept")
@Disabled
public class ConceptDIMAsIndicator extends LinearOpMode {
static final int BLUE_LED = 0; // Blue LED channel on DIM
static final int RED_LED = 1; // Red LED Channel on DIM
// Create timer to toggle LEDs
private ElapsedTime runtime = new ElapsedTime();
// Define class members
DeviceInterfaceModule dim;
@Override
public void runOpMode() {
// Connect to the Device Interface Module.
// Change the text in quotes to match the DIM name in your robot configuration.
dim = hardwareMap.get(DeviceInterfaceModule.class, "dim");
// Toggle LEDs while Waiting for the start button
telemetry.addData(">", "Press Play to test LEDs." );
telemetry.update();
while (!isStarted()) {
// Determine if we are on an odd or even second
boolean even = (((int)(runtime.time()) & 0x01) == 0);
dim.setLED(RED_LED, even); // Red for even
dim.setLED(BLUE_LED, !even); // Blue for odd
idle();
}
// Running now
telemetry.addData(">", "Press X for Blue, B for Red." );
telemetry.update();
// Now just use red and blue buttons to set red and blue LEDs
while(opModeIsActive()){
dim.setLED(BLUE_LED, gamepad1.x);
dim.setLED(RED_LED, gamepad1.b);
idle();
}
// Turn off LEDs;
dim.setLED(BLUE_LED, false);
dim.setLED(RED_LED, false);
telemetry.addData(">", "Done");
telemetry.update();
}
}


@ -0,0 +1,200 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gamepad;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This sample illustrates using the rumble feature of many gamepads.
*
* Note: Some gamepads "rumble" better than others.
* The Xbox & PS4 have a left (rumble1) and right (rumble2) rumble motor.
* These two gamepads have two distinct rumble modes: Large on the left, and small on the right
* The ETpark gamepad may only respond to rumble1, and may only run at full power.
* The Logitech F310 gamepad does not have *any* rumble ability.
*
* Moral: You should use this sample to experiment with your specific gamepads to explore their rumble features.
*
* The rumble motors are accessed through the standard gamepad1 and gamepad2 objects.
* Several new methods were added to the Gamepad class in FTC SDK Rev 7
* The key methods are as follows:
*
* .rumble(double rumble1, double rumble2, int durationMs)
* This method sets the rumble power of both motors for a specific time duration.
* Both rumble arguments are motor-power levels in the 0.0 to 1.0 range.
* durationMs is the desired length of the rumble action in milliseconds.
* This method returns immediately.
* Note:
* Use a durationMs of Gamepad.RUMBLE_DURATION_CONTINUOUS to provide a continuous rumble
* Use a power of 0, or duration of 0 to stop a rumble.
*
* .rumbleBlips(int count) allows an easy way to signal the driver with a series of rumble blips.
* Just specify how many blips you want.
* This method returns immediately.
*
* .runRumbleEffect(customRumbleEffect) allows you to run a custom rumble sequence that you have
* built using the Gamepad.RumbleEffect.Builder()
* A "custom effect" is a sequence of steps, where each step can rumble any of the
* rumble motors for a specific period at a specific power level.
* The Custom Effect will play as the (un-blocked) OpMode continues to run
*
* .isRumbling() returns true if there is a rumble effect in progress.
* Use this call to prevent stepping on a current rumble.
*
* .stopRumble() Stop any ongoing rumble or custom rumble effect.
*
* .rumble(int durationMs) Full power rumble for fixed duration.
*
* Note: Whenever a new Rumble command is issued, any currently executing rumble action will
* be truncated, and the new action started immediately. Take these precautions:
* 1) Do Not SPAM the rumble motors by issuing rapid fire commands
* 2) Multiple sources for rumble commands must coordinate to avoid tromping on each other.
*
* This can be achieved several possible ways:
* 1) Only having one source for rumble actions
* 2) Issuing rumble commands on transitions, rather than states.
* e.g. The moment a touch sensor is pressed, rather than the entire time it is being pressed.
* 3) Scheduling rumble commands based on timed events. e.g. 10 seconds prior to endgame
* 4) Rumble on non-overlapping mechanical actions. e.g. arm fully-extended or fully-retracted.
* 5) Use isRumbling() to hold off on a new rumble if one is already in progress.
*
* The examples shown here are representative of how to invoke a gamepad rumble.
* It is assumed that these will be modified to suit the specific robot and team strategy needs.
*
* ######## Read the telemetry display on the Driver Station Screen for instructions. ######
*
* Ex 1) This example shows a) how to create a custom rumble effect, and then b) how to trigger it based
* on game time. One use for this might be to alert the driver that half-time or End-game is approaching.
*
* Ex 2) This example shows tying the rumble power to a changing sensor value.
* In this case it is the Gamepad trigger, but it could be any sensor output scaled to the 0-1 range.
* Since it takes over the rumble motors, it is only performed when the Left Bumper is pressed.
* Note that this approach MUST include a way to turn OFF the rumble when the button is released.
*
* Ex 3) This example shows a simple way to trigger a 3-blip sequence. In this case it is
* triggered by the gamepad A (Cross) button, but it could be any sensor, like a touch or light sensor.
* Note that this code ensures that it only rumbles once when the input goes true.
*
* Ex 4) This example shows how to trigger a single rumble when an input value gets over a certain value.
* In this case it is reading the Right Trigger, but it could be any variable sensor, like a
* range sensor, or position sensor. The code needs to ensure that it is only triggered once, so
* it waits till the sensor drops back below the threshold before it can trigger again.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Rumble", group ="Concept")
public class ConceptGamepadRumble extends LinearOpMode
{
boolean lastA = false; // Use to track the prior button state.
boolean lastLB = false; // Use to track the prior button state.
boolean highLevel = false; // used to prevent multiple level-based rumbles.
boolean secondHalf = false; // Use to prevent multiple half-time warning rumbles.
Gamepad.RumbleEffect customRumbleEffect; // Use to build a custom rumble sequence.
ElapsedTime runtime = new ElapsedTime(); // Use to determine when end game is starting.
final double HALF_TIME = 60.0; // Wait this many seconds before rumble-alert for half-time.
final double TRIGGER_THRESHOLD = 0.75; // Squeeze more than 3/4 to get rumble.
@Override
public void runOpMode()
{
// Example 1. a) start by creating a three-pulse rumble sequence: right, LEFT, LEFT
customRumbleEffect = new Gamepad.RumbleEffect.Builder()
.addStep(0.0, 1.0, 500) // Rumble right motor 100% for 500 mSec
.addStep(0.0, 0.0, 300) // Pause for 300 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.addStep(0.0, 0.0, 250) // Pause for 250 mSec
.addStep(1.0, 0.0, 250) // Rumble left motor 100% for 250 mSec
.build();
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
runtime.reset(); // Start game timer.
// Loop while monitoring buttons for rumble triggers
while (opModeIsActive())
{
// Read and save the current gamepad button states.
boolean currentA = gamepad1.a ;
boolean currentLB = gamepad1.left_bumper ;
// Display the current Rumble status. Just for interest.
telemetry.addData(">", "Are we RUMBLING? %s\n", gamepad1.isRumbling() ? "YES" : "no" );
// ----------------------------------------------------------------------------------------
// Example 1. b) Watch the runtime timer, and run the custom rumble when we hit half-time.
// Make sure we only signal once by setting "secondHalf" flag to prevent further rumbles.
// ----------------------------------------------------------------------------------------
if ((runtime.seconds() > HALF_TIME) && !secondHalf) {
gamepad1.runRumbleEffect(customRumbleEffect);
secondHalf =true;
}
// Display the time remaining while we are still counting down.
if (!secondHalf) {
telemetry.addData(">", "Halftime Alert Countdown: %3.0f Sec \n", (HALF_TIME - runtime.seconds()) );
}
// ----------------------------------------------------------------------------------------
// Example 2. If Left Bumper is being pressed, power the rumble motors based on the two trigger depressions.
// This is useful to see how the rumble feels at various power levels.
// ----------------------------------------------------------------------------------------
if (currentLB) {
// Left Bumper is being pressed, so send left and right "trigger" values to left and right rumble motors.
gamepad1.rumble(gamepad1.left_trigger, gamepad1.right_trigger, Gamepad.RUMBLE_DURATION_CONTINUOUS);
// Show what is being sent to rumbles
telemetry.addData(">", "Squeeze triggers to control rumbles");
telemetry.addData("> : Rumble", "Left: %.0f%% Right: %.0f%%", gamepad1.left_trigger * 100, gamepad1.right_trigger * 100);
} else {
// Make sure rumble is turned off when Left Bumper is released (only one time each press)
if (lastLB) {
gamepad1.stopRumble();
}
// Prompt for manual rumble action
telemetry.addData(">", "Hold Left-Bumper to test Manual Rumble");
telemetry.addData(">", "Press A (Cross) for three blips");
telemetry.addData(">", "Squeeze right trigger slowly for 1 blip");
}
lastLB = currentLB; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 3. Blip 3 times at the moment that A (Cross) is pressed. (look for pressed transition)
// BUT !!! Skip it altogether if the Gamepad is already rumbling.
// ----------------------------------------------------------------------------------------
if (currentA && !lastA) {
if (!gamepad1.isRumbling()) // Check for possible overlap of rumbles.
gamepad1.rumbleBlips(3);
}
lastA = currentA; // remember the current button state for next time around the loop
// ----------------------------------------------------------------------------------------
// Example 4. Rumble once when gamepad right trigger goes above the THRESHOLD.
// ----------------------------------------------------------------------------------------
if (gamepad1.right_trigger > TRIGGER_THRESHOLD) {
if (!highLevel) {
gamepad1.rumble(0.9, 0, 200); // 200 mSec burst on left motor.
highLevel = true; // Hold off any more triggers
}
} else {
highLevel = false; // We can trigger again now.
}
// Send the telemetry data to the Driver Station, and then pause to pace the program.
telemetry.update();
sleep(10);
}
}
}


@ -0,0 +1,78 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/**
* This sample illustrates using the touchpad feature found on some gamepads.
*
* The Sony PS4 gamepad can detect two distinct touches on the central touchpad.
* Other gamepads with different touchpads may provide mixed results.
*
* The touchpads are accessed through the standard gamepad1 and gamepad2 objects.
* Several new members were added to the Gamepad class in FTC SDK Rev 7
*
* .touchpad_finger_1 returns true if at least one finger is detected.
* .touchpad_finger_1_x finger 1 X coordinate. Valid if touchpad_finger_1 is true
* .touchpad_finger_1_y finger 1 Y coordinate. Valid if touchpad_finger_1 is true
*
* .touchpad_finger_2 returns true if a second finger is detected
* .touchpad_finger_2_x finger 2 X coordinate. Valid if touchpad_finger_2 is true
* .touchpad_finger_2_y finger 2 Y coordinate. Valid if touchpad_finger_2 is true
*
* Finger touches are reported with an X and Y coordinate in the following coordinate system.
*
* 1) X is the Horizontal axis, and Y is the vertical axis
* 2) The 0,0 origin is at the center of the touchpad.
* 3) 1.0, 1.0 is at the top right corner of the touchpad.
* 4) -1.0,-1.0 is at the bottom left corner of the touchpad.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this OpMode to the Driver Station OpMode list.
*/
@Disabled
@TeleOp(name="Concept: Gamepad Touchpad", group ="Concept")
public class ConceptGamepadTouchpad extends LinearOpMode
{
@Override
public void runOpMode()
{
telemetry.setDisplayFormat(Telemetry.DisplayFormat.MONOSPACE);
telemetry.addData(">", "Press Start");
telemetry.update();
waitForStart();
while (opModeIsActive())
{
boolean finger = false;
// Display finger 1 x & y position if finger detected
if(gamepad1.touchpad_finger_1)
{
finger = true;
telemetry.addLine(String.format("Finger 1: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_1_x, gamepad1.touchpad_finger_1_y));
}
// Display finger 2 x & y position if finger detected
if(gamepad1.touchpad_finger_2)
{
finger = true;
telemetry.addLine(String.format("Finger 2: x=%5.2f y=%5.2f\n", gamepad1.touchpad_finger_2_x, gamepad1.touchpad_finger_2_y));
}
if(!finger)
{
telemetry.addLine("No fingers");
}
telemetry.update();
sleep(10);
}
}
}


@ -0,0 +1,223 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsUsbDeviceInterfaceModule;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DeviceInterfaceModule;
import com.qualcomm.robotcore.hardware.I2cAddr;
import com.qualcomm.robotcore.util.RobotLog;
import com.qualcomm.robotcore.util.TypeConversion;
import java.util.concurrent.locks.Lock;
/**
* An example of a linear op mode that shows how to change the I2C address.
*/
@TeleOp(name = "Concept: I2c Address Change", group = "Concept")
@Disabled
public class ConceptI2cAddressChange extends LinearOpMode {
public static final int ADDRESS_SET_NEW_I2C_ADDRESS = 0x70;
// trigger bytes used to change I2C address on ModernRobotics sensors.
public static final byte TRIGGER_BYTE_1 = 0x55;
public static final byte TRIGGER_BYTE_2 = (byte) 0xaa;
// Expected bytes from the Modern Robotics IR Seeker V3 memory map
public static final byte IR_SEEKER_V3_FIRMWARE_REV = 0x12;
public static final byte IR_SEEKER_V3_SENSOR_ID = 0x49;
public static final I2cAddr IR_SEEKER_V3_ORIGINAL_ADDRESS = I2cAddr.create8bit(0x38);
// Expected bytes from the Modern Robotics Color Sensor memory map
public static final byte COLOR_SENSOR_FIRMWARE_REV = 0x10;
public static final byte COLOR_SENSOR_SENSOR_ID = 0x43;
public static final byte COLOR_SENSOR_ORIGINAL_ADDRESS = 0x3C;
public static final byte MANUFACTURER_CODE = 0x4d;
// Currently, this is set to expect the bytes from the IR Seeker.
// If you change these values so you're setting "FIRMWARE_REV" to
// COLOR_SENSOR_FIRMWARE_REV, and "SENSOR_ID" to "COLOR_SENSOR_SENSOR_ID",
// you'll be able to change the I2C address of the ModernRoboticsColorSensor.
// If the bytes you're expecting are different than what this op mode finds,
// a comparison will be printed out into the logfile.
public static final byte FIRMWARE_REV = IR_SEEKER_V3_FIRMWARE_REV;
public static final byte SENSOR_ID = IR_SEEKER_V3_SENSOR_ID;
// These byte values are common with most Modern Robotics sensors.
public static final int READ_MODE = 0x80;
public static final int ADDRESS_MEMORY_START = 0x0;
public static final int TOTAL_MEMORY_LENGTH = 0x0c;
public static final int BUFFER_CHANGE_ADDRESS_LENGTH = 0x03;
// The port where your sensor is connected.
int port = 5;
byte[] readCache;
Lock readLock;
byte[] writeCache;
Lock writeLock;
I2cAddr currentAddress = IR_SEEKER_V3_ORIGINAL_ADDRESS;
// I2c addresses on Modern Robotics devices must be divisible by 2, and between 0x10 and 0x7e
// Different hardware may have different rules.
// Be sure to read the requirements for the hardware you're using!
// If you use an invalid address, you may make your device completely unusable.
I2cAddr newAddress = I2cAddr.create8bit(0x42);
DeviceInterfaceModule dim;
@Override
public void runOpMode() {
// set up the hardware devices we are going to use
dim = hardwareMap.get(DeviceInterfaceModule.class, "dim");
readCache = dim.getI2cReadCache(port);
readLock = dim.getI2cReadCacheLock(port);
writeCache = dim.getI2cWriteCache(port);
writeLock = dim.getI2cWriteCacheLock(port);
// I2c addresses on Modern Robotics devices must be divisible by 2, and between 0x10 and 0x7e
// Different hardware may have different rules.
// Be sure to read the requirements for the hardware you're using!
ModernRoboticsUsbDeviceInterfaceModule.throwIfModernRoboticsI2cAddressIsInvalid(newAddress);
// wait for the start button to be pressed
waitForStart();
performAction("read", port, currentAddress, ADDRESS_MEMORY_START, TOTAL_MEMORY_LENGTH);
while(!dim.isI2cPortReady(port)) {
telemetry.addData("I2cAddressChange", "waiting for the port to be ready...");
telemetry.update();
sleep(1000);
}
// update the local cache
dim.readI2cCacheFromController(port);
// make sure the first bytes are what we think they should be.
int count = 0;
int[] initialArray = {READ_MODE, currentAddress.get8Bit(), ADDRESS_MEMORY_START, TOTAL_MEMORY_LENGTH, FIRMWARE_REV, MANUFACTURER_CODE, SENSOR_ID};
while (!foundExpectedBytes(initialArray, readLock, readCache)) {
telemetry.addData("I2cAddressChange", "Confirming that we're reading the correct bytes...");
telemetry.update();
dim.readI2cCacheFromController(port);
sleep(1000);
count++;
// if we go too long with failure, we probably are expecting the wrong bytes.
if (count >= 10) {
telemetry.addData("I2cAddressChange", String.format("Looping too long with no change, probably have the wrong address. Current address: 8bit=0x%02x", currentAddress.get8Bit()));
hardwareMap.irSeekerSensor.get(String.format("Looping too long with no change, probably have the wrong address. Current address: 8bit=0x%02x", currentAddress.get8Bit()));
telemetry.update();
}
}
// Enable writes to the correct segment of the memory map.
performAction("write", port, currentAddress, ADDRESS_SET_NEW_I2C_ADDRESS, BUFFER_CHANGE_ADDRESS_LENGTH);
// Write out the trigger bytes, and the new desired address.
writeNewAddress();
dim.setI2cPortActionFlag(port);
dim.writeI2cCacheToController(port);
telemetry.addData("I2cAddressChange", "Giving the hardware 60 seconds to make the change...");
telemetry.update();
// Changing the I2C address takes some time.
sleep(60000);
// Query the new address and see if we can get the bytes we expect.
dim.enableI2cReadMode(port, newAddress, ADDRESS_MEMORY_START, TOTAL_MEMORY_LENGTH);
dim.setI2cPortActionFlag(port);
dim.writeI2cCacheToController(port);
int[] confirmArray = {READ_MODE, newAddress.get8Bit(), ADDRESS_MEMORY_START, TOTAL_MEMORY_LENGTH, FIRMWARE_REV, MANUFACTURER_CODE, SENSOR_ID};
while (!foundExpectedBytes(confirmArray, readLock, readCache)) {
telemetry.addData("I2cAddressChange", "Have not confirmed the changes yet...");
telemetry.update();
dim.readI2cCacheFromController(port);
sleep(1000);
}
telemetry.addData("I2cAddressChange", "Successfully changed the I2C address. New address: 8bit=0x%02x", newAddress.get8Bit());
telemetry.update();
RobotLog.i("Successfully changed the I2C address." + String.format("New address: 8bit=0x%02x", newAddress.get8Bit()));
/**** IMPORTANT NOTE ******/
// You need to add a line like this at the top of your op mode
// to update the I2cAddress in the driver.
//irSeeker.setI2cAddress(newAddress);
/***************************/
}
private boolean foundExpectedBytes(int[] byteArray, Lock lock, byte[] cache) {
try {
lock.lock();
boolean allMatch = true;
StringBuilder s = new StringBuilder(300 * 4);
String mismatch = "";
for (int i = 0; i < byteArray.length; i++) {
s.append(String.format("expected: %02x, got: %02x \n", TypeConversion.unsignedByteToInt( (byte) byteArray[i]), cache[i]));
if (TypeConversion.unsignedByteToInt(cache[i]) != TypeConversion.unsignedByteToInt( (byte) byteArray[i])) {
mismatch = String.format("i: %d, byteArray[i]: %02x, cache[i]: %02x", i, byteArray[i], cache[i]);
allMatch = false;
}
}
RobotLog.e(s.toString() + "\n allMatch: " + allMatch + ", mismatch: " + mismatch);
return allMatch;
} finally {
lock.unlock();
}
}
private void performAction(String actionName, int port, I2cAddr i2cAddress, int memAddress, int memLength) {
if (actionName.equalsIgnoreCase("read")) dim.enableI2cReadMode(port, i2cAddress, memAddress, memLength);
if (actionName.equalsIgnoreCase("write")) dim.enableI2cWriteMode(port, i2cAddress, memAddress, memLength);
dim.setI2cPortActionFlag(port);
dim.writeI2cCacheToController(port);
dim.readI2cCacheFromController(port);
}
private void writeNewAddress() {
try {
writeLock.lock();
writeCache[4] = (byte) newAddress.get8Bit();
writeCache[5] = TRIGGER_BYTE_1;
writeCache[6] = TRIGGER_BYTE_2;
} finally {
writeLock.unlock();
}
}
}


@ -0,0 +1,222 @@
/* Copyright (c) 2019 Phil Malone. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.lynx.LynxModule;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorEx;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.util.Iterator;
import java.util.List;
/*
This sample illustrates how to use the Expansion Hub's Bulk-Read feature to speed up control cycle times.
In this example there are 4 motors that need their encoder positions and velocities read.
The sample is written to work with one or two expansion hubs, with no assumption as to where the motors are located.
Three scenarios are tested:
Cache Mode = OFF This is the normal default, where no cache is used, and every read produces a discrete transaction with
an expansion hub, which is the slowest approach.
Cache Mode = AUTO This mode will attempt to minimize the number of discrete read commands, by performing bulk-reads
and then returning values that have been cached. The cache is updated automatically whenever a specific read operation is repeated.
This mode will always return fresh data, but it may perform more bulk-reads than absolutely required.
Extra reads will be performed if multiple identical encoder/velocity reads are performed in one control cycle.
This mode is a good compromise between the OFF and MANUAL modes.
Cache Mode = MANUAL This mode enables the user's code to determine the best time to refresh the cached bulk-read data.
Well organized code can place all the sensor reads in one location, and then just reset the cache once per control cycle.
This approach will produce the shortest cycle times, but it does require the user to manually clear the cache.
-------------------------------------
General tip to speed up your control cycles:
No matter what method you use to read encoders and other inputs, you should try to
avoid reading the same input multiple times around a control loop.
Under normal conditions, this will slow down the control loop.
The preferred method is to read all the required inputs ONCE at the beginning of the loop,
and save the values in variables that can be used by other parts of the control code.
eg: if you are sending encoder positions to your telemetry display, putting a getCurrentPosition()
call in the telemetry statement will force the code to go and get another copy which will take time.
It's much better to read the position into a variable once, and use that variable for control AND display.
Reading saved variables takes no time at all.
Once you put all your sensor reads at the beginning of the control cycle, it's very easy to use
the bulk-read AUTO mode to streamline your cycle timing.
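For example (a minimal sketch; "armMotor" and moveArmTowardTarget() are placeholders for your own hardware and control code):
    int armPos = armMotor.getCurrentPosition();   // read the encoder ONCE per cycle
    telemetry.addData("Arm", armPos);             // reuse the saved value for display...
    moveArmTowardTarget(armPos);                  // ...and reuse it again for control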
*/
@TeleOp (name = "Motor Bulk Reads", group = "Tests")
@Disabled
public class ConceptMotorBulkRead extends LinearOpMode {
final int TEST_CYCLES = 500; // Number of control cycles to run to determine cycle times.
private DcMotorEx m1, m2, m3, m4; // Motor Objects
private long e1, e2, e3, e4; // Encoder Values
private double v1, v2, v3, v4; // Velocities
// Cycle Times
double t1 = 0;
double t2 = 0;
double t3 = 0;
@Override
public void runOpMode() {
int cycles;
// Important Step 1: Make sure you use DcMotorEx when you instantiate your motors.
m1 = hardwareMap.get(DcMotorEx.class, "m1"); // Configure the robot to use these 4 motor names,
m2 = hardwareMap.get(DcMotorEx.class, "m2"); // or change these strings to match your existing Robot Configuration.
m3 = hardwareMap.get(DcMotorEx.class, "m3");
m4 = hardwareMap.get(DcMotorEx.class, "m4");
// Important Step 2: Get access to a list of Expansion Hub Modules to enable changing caching methods.
List<LynxModule> allHubs = hardwareMap.getAll(LynxModule.class);
ElapsedTime timer = new ElapsedTime();
telemetry.addData(">", "Press play to start tests");
telemetry.addData(">", "Test results will update for each access method.");
telemetry.update();
waitForStart();
// --------------------------------------------------------------------------------------
// Run control loop using legacy encoder reads
// In this mode, a single read is done for each encoder position, and a bulk read is done for each velocity read.
// This is the worst case scenario.
// This is the same as using LynxModule.BulkCachingMode.OFF
// --------------------------------------------------------------------------------------
displayCycleTimes("Test 1 of 3 (Wait for completion)");
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition();
e2 = m2.getCurrentPosition();
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t1 = timer.milliseconds() / cycles;
displayCycleTimes("Test 2 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using AUTO cache mode
// In this mode, only one bulk read is done per cycle, UNLESS you read a specific encoder/velocity item AGAIN in that cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option A. Set all Expansion hubs to use the AUTO Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.AUTO);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read for all 4 encoder/velocity reads,
e2 = m2.getCurrentPosition(); // but don't do any `get` operations more than once per cycle.
e3 = m3.getCurrentPosition();
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t2 = timer.milliseconds() / cycles;
displayCycleTimes("Test 3 of 3 (Wait for completion)");
// --------------------------------------------------------------------------------------
// Run test cycles using MANUAL cache mode
In this mode, only one bulk read is done each control cycle.
// This is the MOST efficient method, but it does require that the cache is cleared manually each control cycle.
// --------------------------------------------------------------------------------------
// Important Step 3: Option B. Set all Expansion hubs to use the MANUAL Bulk Caching mode
for (LynxModule module : allHubs) {
module.setBulkCachingMode(LynxModule.BulkCachingMode.MANUAL);
}
timer.reset();
cycles = 0;
while (opModeIsActive() && (cycles++ < TEST_CYCLES)) {
// Important Step 4: If you are using MANUAL mode, you must clear the BulkCache once per control cycle
for (LynxModule module : allHubs) {
module.clearBulkCache();
}
e1 = m1.getCurrentPosition(); // Uses 1 bulk-read to obtain ALL the motor data
e2 = m2.getCurrentPosition(); // There is no penalty for doing more `get` operations in this cycle,
e3 = m3.getCurrentPosition(); // but they will return the same data.
e4 = m4.getCurrentPosition();
v1 = m1.getVelocity();
v2 = m2.getVelocity();
v3 = m3.getVelocity();
v4 = m4.getVelocity();
// Put Control loop action code here.
}
// calculate the average cycle time.
t3 = timer.milliseconds() / cycles;
displayCycleTimes("Complete");
// wait until op-mode is stopped by user, before clearing display.
while (opModeIsActive()) ;
}
// Display three comparison times.
void displayCycleTimes(String status) {
telemetry.addData("Testing", status);
telemetry.addData("Cache = OFF", "%5.1f mS/cycle", t1);
telemetry.addData("Cache = AUTO", "%5.1f mS/cycle", t2);
telemetry.addData("Cache = MANUAL", "%5.1f mS/cycle", t3);
telemetry.update();
}
}

View File

@ -0,0 +1,79 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ElapsedTime;
import java.text.SimpleDateFormat;
import java.util.Date;
/**
* Demonstrates an empty OpMode
*/
@TeleOp(name = "Concept: NullOp", group = "Concept")
@Disabled
public class ConceptNullOp extends OpMode {
private ElapsedTime runtime = new ElapsedTime();
@Override
public void init() {
telemetry.addData("Status", "Initialized");
}
/*
* This method will be called repeatedly after the driver hits INIT, but before they hit PLAY
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#init_loop()
*/
@Override
public void init_loop() {
}
/*
* This method will be called ONCE when start is pressed
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#start()
*/
@Override
public void start() {
runtime.reset();
}
/*
* This method will be called repeatedly in a loop
* @see com.qualcomm.robotcore.eventloop.opmode.OpMode#loop()
*/
@Override
public void loop() {
telemetry.addData("Status", "Run Time: " + runtime.toString());
}
}

View File

@ -0,0 +1,114 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
/**
* This OpMode ramps a single motor speed up and down repeatedly until Stop is pressed.
* The code is structured as a LinearOpMode
*
* This code assumes a DC motor configured with the name "left_drive" as is found on a pushbot.
*
* INCREMENT sets how much to increase/decrease the power each cycle
* CYCLE_MS sets the update period.
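* For example, with the default values below (INCREMENT = 0.01, CYCLE_MS = 50), ramping from zero
* power up to MAX_FWD takes about 100 cycles, or roughly 5 seconds in each direction.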
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Ramp Motor Speed", group = "Concept")
@Disabled
public class ConceptRampMotorSpeed extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to ramp motor each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_FWD = 1.0; // Maximum FWD power applied to motor
static final double MAX_REV = -1.0; // Maximum REV power applied to motor
// Define class members
DcMotor motor;
double power = 0;
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to motor (Assume standard left wheel)
// Change the text in quotes to match any motor name on your robot.
motor = hardwareMap.get(DcMotor.class, "left_drive");
// Wait for the start button
telemetry.addData(">", "Press Start to run Motors." );
telemetry.update();
waitForStart();
// Ramp motor speeds till stop pressed.
while(opModeIsActive()) {
// Ramp the motors, according to the rampUp variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
power += INCREMENT ;
if (power >= MAX_FWD ) {
power = MAX_FWD;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
power -= INCREMENT ;
if (power <= MAX_REV ) {
power = MAX_REV;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Motor Power", "%5.2f", power);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the motor to the new power and pause;
motor.setPower(power);
sleep(CYCLE_MS);
idle();
}
// Turn off motor and signal done;
motor.setPower(0);
telemetry.addData(">", "Done");
telemetry.update();
}
}

View File

@ -0,0 +1,112 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotorSimple;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
*
* This OpMode executes a basic Tank Drive Teleop for a two wheeled robot using two REV SPARKminis.
* To use this example, connect two REV SPARKminis into servo ports on the Expansion Hub. On the
* robot configuration, use the drop down list under 'Servos' to select 'REV SPARKmini Controller'
* and name them 'left_drive' and 'right_drive'.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="REV SPARKmini Simple Drive Example", group="Concept")
@Disabled
public class ConceptRevSPARKMini extends LinearOpMode {
// Declare OpMode members.
private ElapsedTime runtime = new ElapsedTime();
private DcMotorSimple leftDrive = null;
private DcMotorSimple rightDrive = null;
@Override
public void runOpMode() {
telemetry.addData("Status", "Initialized");
telemetry.update();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotorSimple.class, "left_drive");
rightDrive = hardwareMap.get(DcMotorSimple.class, "right_drive");
// Most robots need the motor on one side to be reversed to drive forward
// Reverse the motor that runs backwards when connected directly to the battery
leftDrive.setDirection(DcMotorSimple.Direction.FORWARD);
rightDrive.setDirection(DcMotorSimple.Direction.REVERSE);
// Wait for the game to start (driver presses PLAY)
waitForStart();
runtime.reset();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Setup a variable for each drive wheel to save power level for telemetry
double leftPower;
double rightPower;
// Choose to drive using either Tank Mode, or POV Mode
// Comment out the method that's not used. The default below is POV.
// POV Mode uses left stick to go forward, and right stick to turn.
// - This uses basic math to combine motions and is easier to drive straight.
double drive = -gamepad1.left_stick_y;
double turn = gamepad1.right_stick_x;
leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
// Tank Mode uses one stick to control each wheel.
// - This requires no math, but it is hard to drive forward slowly and keep straight.
// leftPower = -gamepad1.left_stick_y ;
// rightPower = -gamepad1.right_stick_y ;
// Send calculated power to wheels
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
// Show the elapsed game time and wheel power.
telemetry.addData("Status", "Run Time: " + runtime.toString());
telemetry.addData("Motors", "left (%.2f), right (%.2f)", leftPower, rightPower);
telemetry.update();
}
}
}

View File

@ -0,0 +1,115 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Servo;
/**
* This OpMode scans a single servo back and forwards until Stop is pressed.
* The code is structured as a LinearOpMode
* INCREMENT sets how much to increase/decrease the servo position each cycle
* CYCLE_MS sets the update period.
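* For example, with the default values below (INCREMENT = 0.01, CYCLE_MS = 50), a full sweep from
* MIN_POS to MAX_POS takes about 100 cycles, or roughly 5 seconds in each direction.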
*
* This code assumes a Servo configured with the name "left_hand" as is found on a pushbot.
*
* NOTE: When any servo position is set, ALL attached servos are activated, so ensure that any other
* connected servos are able to move freely before running this test.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Concept: Scan Servo", group = "Concept")
@Disabled
public class ConceptScanServo extends LinearOpMode {
static final double INCREMENT = 0.01; // amount to slew servo each CYCLE_MS cycle
static final int CYCLE_MS = 50; // period of each cycle
static final double MAX_POS = 1.0; // Maximum rotational position
static final double MIN_POS = 0.0; // Minimum rotational position
// Define class members
Servo servo;
double position = (MAX_POS - MIN_POS) / 2; // Start at halfway position
boolean rampUp = true;
@Override
public void runOpMode() {
// Connect to servo (Assume PushBot Left Hand)
// Change the text in quotes to match any servo name on your robot.
servo = hardwareMap.get(Servo.class, "left_hand");
// Wait for the start button
telemetry.addData(">", "Press Start to scan Servo." );
telemetry.update();
waitForStart();
// Scan servo till stop pressed.
while(opModeIsActive()){
// slew the servo, according to the rampUp (direction) variable.
if (rampUp) {
// Keep stepping up until we hit the max value.
position += INCREMENT ;
if (position >= MAX_POS ) {
position = MAX_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
else {
// Keep stepping down until we hit the min value.
position -= INCREMENT ;
if (position <= MIN_POS ) {
position = MIN_POS;
rampUp = !rampUp; // Switch ramp direction
}
}
// Display the current value
telemetry.addData("Servo Position", "%5.2f", position);
telemetry.addData(">", "Press Stop to end test." );
telemetry.update();
// Set the servo to the new position and pause;
servo.setPosition(position);
sleep(CYCLE_MS);
idle();
}
// Signal done;
telemetry.addData(">", "Done");
telemetry.update();
}
}

View File

@ -0,0 +1,135 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/**
* This file demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to build sounds into your application as a resource.
* This technique is best suited for use with Android Studio since it assumes you will be creating a new application
*
* If you are using OnBotJava, please see the ConceptSoundsOnBotJava sample
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* For sound files to be used as a compiled-in resource, they need to be located in a folder called "raw" under your "res" (resources) folder.
* You can create your own "raw" folder from scratch, or you can copy the one from the FtcRobotController module.
*
* Android Studio coders will ultimately need a folder in your path as follows:
* <project root>/TeamCode/src/main/res/raw
*
* Copy any .wav files you want to play into this folder.
* Make sure that your files ONLY use lower-case characters, and have no spaces or special characters other than underscore.
*
* The name you give your .wav files will become the resource ID for these sounds.
* eg: gold.wav becomes R.raw.gold
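* (If your OpMode is compiled in the same module as the resource, you could reference that ID directly
* as R.raw.gold; this sample instead looks the ID up by name at runtime using getResources().getIdentifier(),
* as shown in runOpMode() below.)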
*
* If you wish to use the sounds provided for this sample, they are located in:
* <project root>/FtcRobotController/src/main/res/raw
* You can copy and paste the entire 'raw' folder using Android Studio.
*
*/
@TeleOp(name="Concept: Sound Resources", group="Concept")
@Disabled
public class ConceptSoundsASJava extends LinearOpMode {
// Declare OpMode members.
private boolean goldFound; // Sound file present flags
private boolean silverFound;
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean wasB = false;
@Override
public void runOpMode() {
// Determine Resource IDs for sounds built into the RC application.
int silverSoundID = hardwareMap.appContext.getResources().getIdentifier("silver", "raw", hardwareMap.appContext.getPackageName());
int goldSoundID = hardwareMap.appContext.getResources().getIdentifier("gold", "raw", hardwareMap.appContext.getPackageName());
// Determine if sound resources are found.
// Note: Preloading is NOT required, but it's a good way to verify all your sounds are available before you run.
if (goldSoundID != 0)
goldFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, goldSoundID);
if (silverSoundID != 0)
silverFound = SoundPlayer.getInstance().preload(hardwareMap.appContext, silverSoundID);
// Display sound status
telemetry.addData("gold resource", goldFound ? "Found" : "NOT found\n Add gold.wav to /src/main/res/raw" );
telemetry.addData("silver resource", silverFound ? "Found" : "Not found\n Add silver.wav to /src/main/res/raw" );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X, B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a resource)
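// Note: the expression "(isX = gamepad1.x) && !wasX" both samples the button and tests for a rising edge,
// so the sound starts once per press rather than on every loop iteration while X is held down.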
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverSoundID);
telemetry.addData("Playing", "Resource Silver");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a resource)
if (goldFound && (isB = gamepad1.b) && !wasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldSoundID);
telemetry.addData("Playing", "Resource Gold");
telemetry.update();
}
// Save last button states
wasX = isX;
wasB = isB;
}
}
}

View File

@ -0,0 +1,119 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.io.File;
/**
* This file demonstrates how to play simple sounds on both the RC and DS phones.
* It illustrates how to play sound files that have been copied to the RC Phone
* This technique is best suited for use with OnBotJava since it does not require the app to be modified.
*
* Operation:
*
* Gamepad X & B buttons are used to trigger sounds in this example, but any event can be used.
* Note: Time should be allowed for sounds to complete before playing other sounds.
*
* To play a new sound, you will need to copy the .wav files to the phone, and then provide the full path to them as part of your OpMode.
* This is done in this sample for the two sound files: silver.wav and gold.wav
*
* You can put the files in a variety of locations, but we recommend you put them in the /FIRST/blocks/sounds folder.
* Your OpModes will have guaranteed access to this folder, and you can transfer files into this folder using the BLOCKS web page.
* -- There is a link called "sounds" on the right hand side of the color bar on the BLOCKS page that can be used to send sound files to this folder by default.
* Or you can use Windows File Manager, or ADB to transfer the sound files
*
* To get full use of THIS sample, you will need to copy two sound files called silver.wav and gold.wav to /FIRST/blocks/sounds on the RC phone.
* They can be located here:
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/gold.wav
* https://github.com/ftctechnh/ftc_app/tree/master/FtcRobotController/src/main/res/raw/silver.wav
*/
@TeleOp(name="Concept: Sound Files", group="Concept")
@Disabled
public class ConceptSoundsOnBotJava extends LinearOpMode {
// Point to sound files on the phone's drive
private String soundPath = "/FIRST/blocks/sounds";
private File goldFile = new File("/sdcard" + soundPath + "/gold.wav");
private File silverFile = new File("/sdcard" + soundPath + "/silver.wav");
// Declare OpMode members.
private boolean isX = false; // Gamepad button state variables
private boolean isB = false;
private boolean wasX = false; // Gamepad button history variables
private boolean wasB = false;
@Override
public void runOpMode() {
// Make sure that the sound files exist on the phone
boolean goldFound = goldFile.exists();
boolean silverFound = silverFile.exists();
// Display sound status
telemetry.addData("gold sound", goldFound ? "Found" : "NOT Found \nCopy gold.wav to " + soundPath );
telemetry.addData("silver sound", silverFound ? "Found" : "NOT Found \nCopy silver.wav to " + soundPath );
// Wait for the game to start (driver presses PLAY)
telemetry.addData(">", "Press Start to continue");
telemetry.update();
waitForStart();
telemetry.addData(">", "Press X or B to play sounds.");
telemetry.update();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// say Silver each time gamepad X is pressed (This sound is a resource)
if (silverFound && (isX = gamepad1.x) && !wasX) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, silverFile);
telemetry.addData("Playing", "Silver File");
telemetry.update();
}
// say Gold each time gamepad B is pressed (This sound is a resource)
if (goldFound && (isB = gamepad1.b) && !wasB) {
SoundPlayer.getInstance().startPlaying(hardwareMap.appContext, goldFile);
telemetry.addData("Playing", "Gold File");
telemetry.update();
}
// Save last button states
wasX = isX;
wasB = isB;
}
}
}

View File

@ -0,0 +1,123 @@
/* Copyright (c) 2018 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.content.Context;
import com.qualcomm.ftccommon.SoundPlayer;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
/**
* This file demonstrates how to play one of the several SKYSTONE/Star Wars sounds loaded into the SDK.
* It does this by creating a simple "chooser" controlled by the gamepad Up Down buttons.
* This code also prevents sounds from stacking up by setting a "playing" flag, which is cleared when the sound finishes playing.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* Operation:
* Use the DPAD to change the selected sound, and the Right Bumper to play it.
*/
@TeleOp(name="SKYSTONE Sounds", group="Concept")
@Disabled
public class ConceptSoundsSKYSTONE extends LinearOpMode {
// List of available sound resources
String sounds[] = {"ss_alarm", "ss_bb8_down", "ss_bb8_up", "ss_darth_vader", "ss_fly_by",
"ss_mf_fail", "ss_laser", "ss_laser_burst", "ss_light_saber", "ss_light_saber_long", "ss_light_saber_short",
"ss_light_speed", "ss_mine", "ss_power_up", "ss_r2d2_up", "ss_roger_roger", "ss_siren", "ss_wookie" };
boolean soundPlaying = false;
@Override
public void runOpMode() {
// Variables for choosing from the available sounds
int soundIndex = 0;
int soundID = -1;
boolean was_dpad_up = false;
boolean was_dpad_down = false;
Context myApp = hardwareMap.appContext;
// create a sound parameter that holds the desired player parameters.
SoundPlayer.PlaySoundParams params = new SoundPlayer.PlaySoundParams();
params.loopControl = 0;
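// loopControl = 0 requests a single (non-looping) playback; see SoundPlayer.PlaySoundParams for looping options.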
params.waitForNonLoopingSoundsToFinish = true;
// In this sample, we will skip waiting for the user to press play, and start displaying sound choices right away
while (!isStopRequested()) {
// Look for DPAD presses to change the selection
if (gamepad1.dpad_down && !was_dpad_down) {
// Go to next sound (with list wrap) and display it
soundIndex = (soundIndex + 1) % sounds.length;
}
if (gamepad1.dpad_up && !was_dpad_up) {
// Go to previous sound (with list wrap) and display it
soundIndex = (soundIndex + sounds.length - 1) % sounds.length;
}
// Look for trigger to see if we should play sound
// Only start a new sound if we are currently not playing one.
if (gamepad1.right_bumper && !soundPlaying) {
// Determine Resource IDs for the sounds you want to play, and make sure it's valid.
if ((soundID = myApp.getResources().getIdentifier(sounds[soundIndex], "raw", myApp.getPackageName())) != 0){
// Signal that the sound is now playing.
soundPlaying = true;
// Start playing, and also Create a callback that will clear the playing flag when the sound is complete.
SoundPlayer.getInstance().startPlaying(myApp, soundID, params, null,
new Runnable() {
public void run() {
soundPlaying = false;
}} );
}
}
// Remember the last state of the dpad to detect changes.
was_dpad_up = gamepad1.dpad_up;
was_dpad_down = gamepad1.dpad_down;
// Display the current sound choice, and the playing status.
telemetry.addData("", "Use DPAD up/down to choose sound.");
telemetry.addData("", "Press Right Bumper to play sound.");
telemetry.addData("", "");
telemetry.addData("Sound >", sounds[soundIndex]);
telemetry.addData("Status >", soundPlaying ? "Playing" : "Stopped");
telemetry.update();
}
}
}

View File

@ -0,0 +1,178 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.VoltageSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.Telemetry;
/**
* {@link ConceptTelemetry} illustrates various ways in which telemetry can be
* transmitted from the robot controller to the driver station. The sample illustrates
* numeric and text data, formatted output, and optimized evaluation of expensive-to-acquire
* information. The telemetry {@link Telemetry#log() log} is illustrated by scrolling a poem
* to the driver station.
*
* @see Telemetry
*/
@TeleOp(name = "Concept: Telemetry", group = "Concept")
@Disabled
public class ConceptTelemetry extends LinearOpMode {
/** keeps track of the line of the poem which is to be emitted next */
int poemLine = 0;
/** keeps track of how long it's been since we last emitted a line of poetry */
ElapsedTime poemElapsed = new ElapsedTime();
static final String[] poem = new String[] {
"Mary had a little lamb,",
"His fleece was white as snow,",
"And everywhere that Mary went,",
"The lamb was sure to go.",
"",
"He followed her to school one day,",
"Which was against the rule,",
"It made the children laugh and play",
"To see a lamb at school.",
"",
"And so the teacher turned it out,",
"But still it lingered near,",
"And waited patiently about,",
"Till Mary did appear.",
"",
"\"Why does the lamb love Mary so?\"",
"The eager children cry.",
"\"Why, Mary loves the lamb, you know,\"",
"The teacher did reply.",
"",
""
};
@Override public void runOpMode() {
/* we keep track of how long it's been since the OpMode was started, just
* to have some interesting data to show */
ElapsedTime opmodeRunTime = new ElapsedTime();
// We show the log in oldest-to-newest order, as that's better for poetry
telemetry.log().setDisplayOrder(Telemetry.Log.DisplayOrder.OLDEST_FIRST);
// We can control the number of lines shown in the log
telemetry.log().setCapacity(6);
// The interval between lines of poetry, in seconds
double sPoemInterval = 0.6;
/**
* Wait until we've been given the ok to go. For something to do, we emit the
* elapsed time as we sit here and wait. If we didn't want to do anything while
* we waited, we would just call {@link #waitForStart()}.
*/
while (!isStarted()) {
telemetry.addData("time", "%.1f seconds", opmodeRunTime.seconds());
telemetry.update();
idle();
}
// Ok, we've been given the ok to go
/**
* As an illustration, the first line on our telemetry display will display the battery voltage.
* The idea here is that it's expensive to compute the voltage (at least for purposes of illustration)
* so you don't want to do it unless the data is <em>actually</em> going to make it to the
* driver station (recall that telemetry transmission is throttled to reduce bandwidth use).
* Note that getBatteryVoltage() below returns 'Infinity' if there's no voltage sensor attached.
*
* @see Telemetry#getMsTransmissionInterval()
*/
telemetry.addData("voltage", "%.1f volts", new Func<Double>() {
@Override public Double value() {
return getBatteryVoltage();
}
});
// Reset to keep some timing stats for the post-'start' part of the opmode
opmodeRunTime.reset();
int loopCount = 1;
// Go go gadget robot!
while (opModeIsActive()) {
// Emit poetry if it's been a while
if (poemElapsed.seconds() > sPoemInterval) {
emitPoemLine();
}
// As an illustration, show some loop timing information
telemetry.addData("loop count", loopCount);
telemetry.addData("ms/loop", "%.3f ms", opmodeRunTime.milliseconds() / loopCount);
// Show joystick information as some other illustrative data
telemetry.addLine("left joystick | ")
.addData("x", gamepad1.left_stick_x)
.addData("y", gamepad1.left_stick_y);
telemetry.addLine("right joystick | ")
.addData("x", gamepad1.right_stick_x)
.addData("y", gamepad1.right_stick_y);
/**
* Transmit the telemetry to the driver station, subject to throttling.
* @see Telemetry#getMsTransmissionInterval()
*/
telemetry.update();
/** Update loop info and play nice with the rest of the {@link Thread}s in the system */
loopCount++;
}
}
// emits a line of poetry to the telemetry log
void emitPoemLine() {
telemetry.log().add(poem[poemLine]);
poemLine = (poemLine+1) % poem.length;
poemElapsed.reset();
}
// Computes the current battery voltage
double getBatteryVoltage() {
double result = Double.POSITIVE_INFINITY;
for (VoltageSensor sensor : hardwareMap.voltageSensor) {
double voltage = sensor.getVoltage();
if (voltage > 0) {
result = Math.min(result, voltage);
}
}
return result;
}
}

View File

@ -0,0 +1,186 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer.CameraDirection;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
/**
* This 2021-2022 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine the position of the Freight Frenzy game elements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetection extends LinearOpMode {
/* Note: This sample uses the all-objects Tensor Flow model (FreightFrenzy_BCDM.tflite), which contains
* the following 4 detectable objects
* 0: Ball,
* 1: Cube,
* 2: Duck,
* 3: Marker (duck location tape marker)
*
* Two additional model assets are available which only contain a subset of the objects:
* FreightFrenzy_BC.tflite 0: Ball, 1: Cube
* FreightFrenzy_DM.tflite 0: Duck, 1: Marker
*/
private static final String TFOD_MODEL_ASSET = "FreightFrenzy_BCDM.tflite";
private static final String[] LABELS = {
"Ball",
"Cube",
"Duck",
"Marker"
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
private VuforiaLocalizer vuforia;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
*/
private TFObjectDetector tfod;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can adjust the magnification value
// to artificially zoom in to the center of image. For best results, the "aspectRatio" argument
// should be set to the value of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(2.5, 16.0/9.0);
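// (A magnification of 2.5 effectively limits detection to roughly the center 1/2.5 of the camera frame.)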
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
// getUpdatedRecognitions() will return null if no new information is available since
// the last time that call was made.
List<Recognition> updatedRecognitions = tfod.getUpdatedRecognitions();
if (updatedRecognitions != null) {
telemetry.addData("# Object Detected", updatedRecognitions.size());
// step through the list of recognitions and display boundary info.
int i = 0;
for (Recognition recognition : updatedRecognitions) {
telemetry.addData(String.format("label (%d)", i), recognition.getLabel());
telemetry.addData(String.format(" left,top (%d)", i), "%.03f , %.03f",
recognition.getLeft(), recognition.getTop());
telemetry.addData(String.format(" right,bottom (%d)", i), "%.03f , %.03f",
recognition.getRight(), recognition.getBottom());
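// As a sketch of how these bounds might be used: the detection's horizontal center,
// e.g. (recognition.getLeft() + recognition.getRight()) / 2, could be compared to the image
// width to estimate which barcode position the Duck is sitting on.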
i++;
}
telemetry.update();
}
}
}
}
}
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraDirection = CameraDirection.BACK;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Loading trackables is not necessary for the TensorFlow Object Detection engine.
}
/**
* Initialize the TensorFlow Object Detection engine.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.8f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 320;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
}
}

View File

@ -0,0 +1,221 @@
/* Copyright (c) 2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.SwitchableCamera;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
/**
* This 2021-2022 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine the position of the Freight Frenzy game elements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Switchable Cameras", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionSwitchableCameras extends LinearOpMode {
/* Note: This sample uses the all-objects TensorFlow model (FreightFrenzy_BCDM.tflite), which contains
* the following 4 detectable objects
* 0: Ball,
* 1: Cube,
* 2: Duck,
* 3: Marker (duck location tape marker)
*
* Two additional model assets are available which only contain a subset of the objects:
* FreightFrenzy_BC.tflite 0: Ball, 1: Cube
* FreightFrenzy_DM.tflite 0: Duck, 1: Marker
*/
private static final String TFOD_MODEL_ASSET = "FreightFrenzy_BCDM.tflite";
private static final String[] LABELS = {
"Ball",
"Cube",
"Duck",
"Marker"
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
private VuforiaLocalizer vuforia;
/**
* Variables used for switching cameras.
*/
private WebcamName webcam1, webcam2;
private SwitchableCamera switchableCamera;
private boolean oldLeftBumper;
private boolean oldRightBumper;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
*/
private TFObjectDetector tfod;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can adjust the magnification value
// to artificially zoom in to the center of the image. For best results, the "aspectRatio" argument
// should be set to the aspect ratio of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(2.5, 16.0/9.0);
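// For reference, setZoom(1.0, 16.0/9.0) would leave the full camera frame un-magnified; larger
// magnification values crop more tightly around the center of the image.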
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
doCameraSwitching();
List<Recognition> recognitions = tfod.getRecognitions();
telemetry.addData("# Object Detected", recognitions.size());
// step through the list of recognitions and display boundary info.
int i = 0;
for (Recognition recognition : recognitions) {
telemetry.addData(String.format("label (%d)", i), recognition.getLabel());
telemetry.addData(String.format(" left,top (%d)", i), "%.03f , %.03f",
recognition.getLeft(), recognition.getTop());
telemetry.addData(String.format(" right,bottom (%d)", i), "%.03f , %.03f",
recognition.getRight(), recognition.getBottom());
i++;
}
telemetry.update();
}
}
}
}
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// Indicate that we wish to be able to switch cameras.
webcam1 = hardwareMap.get(WebcamName.class, "Webcam 1");
webcam2 = hardwareMap.get(WebcamName.class, "Webcam 2");
parameters.cameraName = ClassFactory.getInstance().getCameraManager().nameForSwitchableCamera(webcam1, webcam2);
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Set the active camera to Webcam 1.
switchableCamera = (SwitchableCamera) vuforia.getCamera();
switchableCamera.setActiveCamera(webcam1);
// Loading trackables is not necessary for the TensorFlow Object Detection engine.
}
/**
* Initialize the TensorFlow Object Detection engine.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.8f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 320;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
}
private void doCameraSwitching() {
// If the left bumper is pressed, use Webcam 1.
// If the right bumper is pressed, use Webcam 2.
boolean newLeftBumper = gamepad1.left_bumper;
boolean newRightBumper = gamepad1.right_bumper;
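// Rising-edge detection: only switch cameras on the transition from released to pressed,
// so holding a bumper down does not repeatedly re-select the same camera.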
if (newLeftBumper && !oldLeftBumper) {
switchableCamera.setActiveCamera(webcam1);
} else if (newRightBumper && !oldRightBumper) {
switchableCamera.setActiveCamera(webcam2);
}
oldLeftBumper = newLeftBumper;
oldRightBumper = newRightBumper;
if (switchableCamera.getActiveCamera().equals(webcam1)) {
telemetry.addData("activeCamera", "Webcam 1");
telemetry.addData("Press RightBumper", "to switch to Webcam 2");
} else {
telemetry.addData("activeCamera", "Webcam 2");
telemetry.addData("Press LeftBumper", "to switch to Webcam 1");
}
}
}

View File

@ -0,0 +1,185 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import java.util.List;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.tfod.TFObjectDetector;
import org.firstinspires.ftc.robotcore.external.tfod.Recognition;
/**
* This 2021-2022 OpMode illustrates the basics of using the TensorFlow Object Detection API to
* determine the position of the Freight Frenzy game elements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name = "Concept: TensorFlow Object Detection Webcam", group = "Concept")
@Disabled
public class ConceptTensorFlowObjectDetectionWebcam extends LinearOpMode {
/* Note: This sample uses the all-objects TensorFlow model (FreightFrenzy_BCDM.tflite), which contains
* the following 4 detectable objects
* 0: Ball,
* 1: Cube,
* 2: Duck,
* 3: Marker (duck location tape marker)
*
* Two additional model assets are available which only contain a subset of the objects:
* FreightFrenzy_BC.tflite 0: Ball, 1: Cube
* FreightFrenzy_DM.tflite 0: Duck, 1: Marker
*/
private static final String TFOD_MODEL_ASSET = "FreightFrenzy_BCDM.tflite";
private static final String[] LABELS = {
"Ball",
"Cube",
"Duck",
"Marker"
};
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
private VuforiaLocalizer vuforia;
/**
* {@link #tfod} is the variable we will use to store our instance of the TensorFlow Object
* Detection engine.
*/
private TFObjectDetector tfod;
@Override
public void runOpMode() {
// The TFObjectDetector uses the camera frames from the VuforiaLocalizer, so we create that
// first.
initVuforia();
initTfod();
/**
* Activate TensorFlow Object Detection before we wait for the start command.
* Do it here so that the Camera Stream window will have the TensorFlow annotations visible.
**/
if (tfod != null) {
tfod.activate();
// The TensorFlow software will scale the input images from the camera to a lower resolution.
// This can result in lower detection accuracy at longer distances (> 55cm or 22").
// If your target is at a distance greater than 50 cm (20"), you can adjust the magnification value
// to artificially zoom in to the center of the image. For best results, the "aspectRatio" argument
// should be set to the aspect ratio of the images used to create the TensorFlow Object Detection model
// (typically 16/9).
tfod.setZoom(2.5, 16.0/9.0);
}
/** Wait for the game to begin */
telemetry.addData(">", "Press Play to start op mode");
telemetry.update();
waitForStart();
if (opModeIsActive()) {
while (opModeIsActive()) {
if (tfod != null) {
// getUpdatedRecognitions() will return null if no new information is available since
// the last time that call was made.
List<Recognition> updatedRecognitions = tfod.getUpdatedRecognitions();
if (updatedRecognitions != null) {
telemetry.addData("# Object Detected", updatedRecognitions.size());
// step through the list of recognitions and display boundary info.
int i = 0;
for (Recognition recognition : updatedRecognitions) {
telemetry.addData(String.format("label (%d)", i), recognition.getLabel());
telemetry.addData(String.format(" left,top (%d)", i), "%.03f , %.03f",
recognition.getLeft(), recognition.getTop());
telemetry.addData(String.format(" right,bottom (%d)", i), "%.03f , %.03f",
recognition.getRight(), recognition.getBottom());
i++;
}
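// A possible extension (hypothetical, not part of this sample): inside the loop above, a team
// could branch on the label to drive autonomous decisions, e.g.
//   if ("Duck".equals(recognition.getLabel())) { /* use getLeft() to infer the barcode slot */ }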
telemetry.update();
}
}
}
}
}
/**
* Initialize the Vuforia localization engine.
*/
private void initVuforia() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
*/
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Loading trackables is not necessary for the TensorFlow Object Detection engine.
}
/**
* Initialize the TensorFlow Object Detection engine.
*/
private void initTfod() {
int tfodMonitorViewId = hardwareMap.appContext.getResources().getIdentifier(
"tfodMonitorViewId", "id", hardwareMap.appContext.getPackageName());
TFObjectDetector.Parameters tfodParameters = new TFObjectDetector.Parameters(tfodMonitorViewId);
tfodParameters.minResultConfidence = 0.8f;
tfodParameters.isModelTensorFlow2 = true;
tfodParameters.inputSize = 320;
tfod = ClassFactory.getInstance().createTFObjectDetector(tfodParameters, vuforia);
tfod.loadModelFromAsset(TFOD_MODEL_ASSET, LABELS);
}
}

View File

@ -0,0 +1,186 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.RelicRecoveryVuMark;
import org.firstinspires.ftc.robotcore.external.navigation.VuMarkInstanceId;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates the basics of using the Vuforia engine to determine
* the identity of Vuforia VuMarks encountered on the field. The code is structured as
* a LinearOpMode. It shares much structure with {@link ConceptVuforiaNavigation}; rather than
* duplicate the core Vuforia documentation found there, we focus here on the
* differences between using Vuforia for navigation and for VuMark identification.
*
* @see ConceptVuforiaNavigation
* @see VuforiaLocalizer
* @see VuforiaTrackableDefaultListener
* see ftc_app/doc/tutorial/FTC_FieldCoordinateSystemDefinition.pdf
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained in {@link ConceptVuforiaNavigation}.
*/
@TeleOp(name="Concept: VuMark Id", group ="Concept")
@Disabled
public class ConceptVuMarkIdentification extends LinearOpMode {
public static final String TAG = "Vuforia VuMark Sample";
OpenGLMatrix lastLocation = null;
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
VuforiaLocalizer vuforia;
@Override public void runOpMode() {
/*
* To start up Vuforia, tell it the view that we wish to use for camera monitor (on the RC phone);
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// OR... Do Not Activate the Camera Monitor View, to save power
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
parameters.vuforiaLicenseKey = " -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/*
* We also indicate which camera on the RC that we wish to use.
* Here we chose the back (HiRes) camera (for greater range), but
* for a competition robot, the front camera might be more convenient.
*/
parameters.cameraDirection = VuforiaLocalizer.CameraDirection.BACK;
/**
* Instantiate the Vuforia engine
*/
vuforia = ClassFactory.getInstance().createVuforia(parameters);
/**
* Load the data set containing the VuMarks for Relic Recovery. There's only one trackable
* in this data set: all three of the VuMarks in the game were created from this one template,
* but differ in their instance id information.
* @see VuMarkInstanceId
*/
VuforiaTrackables relicTrackables = this.vuforia.loadTrackablesFromAsset("RelicVuMark");
VuforiaTrackable relicTemplate = relicTrackables.get(0);
relicTemplate.setName("relicVuMarkTemplate"); // can help in debugging; otherwise not necessary
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
relicTrackables.activate();
while (opModeIsActive()) {
/**
* See if any of the instances of {@link relicTemplate} are currently visible.
* {@link RelicRecoveryVuMark} is an enum which can have the following values:
* UNKNOWN, LEFT, CENTER, and RIGHT. When a VuMark is visible, something other than
* UNKNOWN will be returned by {@link RelicRecoveryVuMark#from(VuforiaTrackable)}.
*/
RelicRecoveryVuMark vuMark = RelicRecoveryVuMark.from(relicTemplate);
if (vuMark != RelicRecoveryVuMark.UNKNOWN) {
/* Found an instance of the template. In the actual game, you will probably
* loop until this condition occurs, then move on to act accordingly depending
* on which VuMark was visible. */
telemetry.addData("VuMark", "%s visible", vuMark);
/* For fun, we also exhibit the navigational pose. In the Relic Recovery game,
* it is perhaps unlikely that you will actually need to act on this pose information, but
* we illustrate it nevertheless, for completeness. */
OpenGLMatrix pose = ((VuforiaTrackableDefaultListener)relicTemplate.getListener()).getPose();
telemetry.addData("Pose", format(pose));
/* We further illustrate how to decompose the pose into useful rotational and
* translational components */
if (pose != null) {
VectorF trans = pose.getTranslation();
Orientation rot = Orientation.getOrientation(pose, AxesReference.EXTRINSIC, AxesOrder.XYZ, AngleUnit.DEGREES);
// Extract the X, Y, and Z components of the offset of the target relative to the robot
double tX = trans.get(0);
double tY = trans.get(1);
double tZ = trans.get(2);
// Extract the rotational components of the target relative to the robot
double rX = rot.firstAngle;
double rY = rot.secondAngle;
double rZ = rot.thirdAngle;
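// These components are computed here for illustration; a hypothetical use would be to report
// them, e.g. telemetry.addData("Offset", "{X, Y, Z} = %.0f, %.0f, %.0f", tX, tY, tZ);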
}
}
else {
telemetry.addData("VuMark", "not visible");
}
telemetry.update();
}
}
String format(OpenGLMatrix transformationMatrix) {
return (transformationMatrix != null) ? transformationMatrix.formatAsTransform() : "null";
}
}

View File

@ -0,0 +1,194 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.RelicRecoveryVuMark;
import org.firstinspires.ftc.robotcore.external.navigation.VuMarkInstanceId;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates the basics of using the Vuforia engine to determine
* the identity of Vuforia VuMarks encountered on the field. The code is structured as
* a LinearOpMode. It shares much structure with {@link ConceptVuforiaNavigationWebcam}; rather than
* duplicate the core Vuforia documentation found there, we focus here on the
* differences between using Vuforia for navigation and for VuMark identification.
*
* @see ConceptVuforiaNavigationWebcam
* @see VuforiaLocalizer
* @see VuforiaTrackableDefaultListener
* see ftc_app/doc/tutorial/FTC_FieldCoordinateSystemDefinition.pdf
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained in {@link ConceptVuforiaNavigationWebcam}.
*/
@TeleOp(name="Concept: VuMark Id Webcam", group ="Concept")
@Disabled
public class ConceptVuMarkIdentificationWebcam extends LinearOpMode {
public static final String TAG = "Vuforia VuMark Sample";
OpenGLMatrix lastLocation = null;
/**
* {@link #vuforia} is the variable we will use to store our instance of the Vuforia
* localization engine.
*/
VuforiaLocalizer vuforia;
/**
* This is the webcam we are to use. As with other hardware devices such as motors and
* servos, this device is identified using the robot configuration tool in the FTC application.
*/
WebcamName webcamName;
@Override public void runOpMode() {
/*
* Retrieve the camera we are to use.
*/
webcamName = hardwareMap.get(WebcamName.class, "Webcam 1");
/*
* To start up Vuforia, tell it the view that we wish to use for camera monitor (on the RC phone);
* If no camera monitor is desired, use the parameterless constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// OR... Do Not Activate the Camera Monitor View, to save power
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
parameters.vuforiaLicenseKey = " -- YOUR NEW VUFORIA KEY GOES HERE --- ";
/**
* We also indicate which camera on the RC we wish to use. For pedagogical purposes,
* we use the same logic as in {@link ConceptVuforiaNavigationWebcam}.
*/
parameters.cameraName = webcamName;
this.vuforia = ClassFactory.getInstance().createVuforia(parameters);
/**
* Load the data set containing the VuMarks for Relic Recovery. There's only one trackable
* in this data set: all three of the VuMarks in the game were created from this one template,
* but differ in their instance id information.
* @see VuMarkInstanceId
*/
VuforiaTrackables relicTrackables = this.vuforia.loadTrackablesFromAsset("RelicVuMark");
VuforiaTrackable relicTemplate = relicTrackables.get(0);
relicTemplate.setName("relicVuMarkTemplate"); // can help in debugging; otherwise not necessary
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
relicTrackables.activate();
while (opModeIsActive()) {
/**
* See if any of the instances of {@link relicTemplate} are currently visible.
* {@link RelicRecoveryVuMark} is an enum which can have the following values:
* UNKNOWN, LEFT, CENTER, and RIGHT. When a VuMark is visible, something other than
* UNKNOWN will be returned by {@link RelicRecoveryVuMark#from(VuforiaTrackable)}.
*/
RelicRecoveryVuMark vuMark = RelicRecoveryVuMark.from(relicTemplate);
if (vuMark != RelicRecoveryVuMark.UNKNOWN) {
/* Found an instance of the template. In the actual game, you will probably
* loop until this condition occurs, then move on to act accordingly depending
* on which VuMark was visible. */
telemetry.addData("VuMark", "%s visible", vuMark);
/* For fun, we also exhibit the navigational pose. In the Relic Recovery game,
* it is perhaps unlikely that you will actually need to act on this pose information, but
* we illustrate it nevertheless, for completeness. */
OpenGLMatrix pose = ((VuforiaTrackableDefaultListener)relicTemplate.getListener()).getFtcCameraFromTarget();
telemetry.addData("Pose", format(pose));
/* We further illustrate how to decompose the pose into useful rotational and
* translational components */
if (pose != null) {
VectorF trans = pose.getTranslation();
Orientation rot = Orientation.getOrientation(pose, AxesReference.EXTRINSIC, AxesOrder.XYZ, AngleUnit.DEGREES);
// Extract the X, Y, and Z components of the offset of the target relative to the robot
double tX = trans.get(0);
double tY = trans.get(1);
double tZ = trans.get(2);
// Extract the rotational components of the target relative to the robot
double rX = rot.firstAngle;
double rY = rot.secondAngle;
double rZ = rot.thirdAngle;
}
}
else {
telemetry.addData("VuMark", "not visible");
}
telemetry.update();
}
}
String format(OpenGLMatrix transformationMatrix) {
return (transformationMatrix != null) ? transformationMatrix.formatAsTransform() : "null";
}
}

View File

@ -0,0 +1,202 @@
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.Range;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
/**
* This OpMode illustrates using a webcam to locate and drive towards ANY Vuforia target.
* The code assumes a basic two-wheel Robot Configuration with motors named left_drive and right_drive.
* The motor directions must be set so a positive drive goes forward and a positive turn rotates to the right.
*
* Under manual control, the left stick will move forward/back, and the right stick will turn left/right.
* This is called POV Joystick mode, different from Tank Drive (where each joystick controls a wheel).
* Manually drive the robot until it displays Target data on the Driver Station.
* Press and hold the *Left Bumper* to enable the automatic "Drive to target" mode.
* Release the Left Bumper to return to manual driving mode.
*
* Use DESIRED_DISTANCE to set how close you want the robot to get to the target.
* Speed and Turn sensitivity can be adjusted using the SPEED_GAIN and TURN_GAIN constants.
*
* For more Vuforia details, or to adapt this OpMode for a phone camera, view the
* ConceptVuforiaFieldNavigation and ConceptVuforiaFieldNavigationWebcam samples.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Drive To Target", group = "Concept")
@Disabled
public class ConceptVuforiaDriveToTargetWebcam extends LinearOpMode
{
// Adjust these numbers to suit your robot.
final double DESIRED_DISTANCE = 8.0; // this is how close the camera should get to the target (inches)
// The GAIN constants set the relationship between the measured position error,
// and how much power is applied to the drive motors. Drive = Error * Gain
// Make these values smaller for smoother control.
final double SPEED_GAIN = 0.02 ; // Speed Control "Gain". eg: Ramp up to 50% power at a 25 inch error. (0.50 / 25.0)
final double TURN_GAIN = 0.01 ; // Turn Control "Gain". eg: Ramp up to 25% power at a 25 degree error. (0.25 / 25.0)
final double MM_PER_INCH = 25.40 ; // Metric conversion
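// Worked example: with these gains, a 25 inch range error commands drive = 25 * 0.02 = 0.50
// (50% power), and a 25 degree bearing error commands turn = 25 * 0.01 = 0.25 (25% power),
// matching the comments above.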
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" --- YOUR NEW VUFORIA KEY GOES HERE --- ";
VuforiaLocalizer vuforia = null;
OpenGLMatrix targetPose = null;
String targetName = "";
private DcMotor leftDrive = null;
private DcMotor rightDrive = null;
@Override public void runOpMode()
{
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* To get an on-phone camera preview, use the code below.
* If no camera preview is desired, use the parameter-less constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Connect to the camera we are to use. This name must match what is set up in Robot Configuration
parameters.cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
this.vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the trackable objects from the Assets file, and give them meaningful names
VuforiaTrackables targetsFreightFrenzy = this.vuforia.loadTrackablesFromAsset("FreightFrenzy");
targetsFreightFrenzy.get(0).setName("Blue Storage");
targetsFreightFrenzy.get(1).setName("Blue Alliance Wall");
targetsFreightFrenzy.get(2).setName("Red Storage");
targetsFreightFrenzy.get(3).setName("Red Alliance Wall");
// Start tracking targets in the background
targetsFreightFrenzy.activate();
// Initialize the hardware variables. Note that the strings used here as parameters
// to 'get' must correspond to the names assigned during the robot configuration
// step (using the FTC Robot Controller app on the phone).
leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
rightDrive = hardwareMap.get(DcMotor.class, "right_drive");
// To drive forward, most robots need the motor on one side to be reversed, because the axles point in opposite directions.
// Pushing the left stick forward MUST make the robot go forward. So adjust these two lines based on your first test drive.
leftDrive.setDirection(DcMotor.Direction.FORWARD);
rightDrive.setDirection(DcMotor.Direction.REVERSE);
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
boolean targetFound = false; // Set to true when a target is detected by Vuforia
double targetRange = 0; // Distance from camera to target in Inches
double targetBearing = 0; // Robot Heading, relative to target. Positive degrees means target is to the right.
double drive = 0; // Desired forward power (-1 to +1)
double turn = 0; // Desired turning power (-1 to +1)
while (opModeIsActive())
{
// Look for first visible target, and save its pose.
targetFound = false;
for (VuforiaTrackable trackable : targetsFreightFrenzy)
{
if (((VuforiaTrackableDefaultListener) trackable.getListener()).isVisible())
{
targetPose = ((VuforiaTrackableDefaultListener)trackable.getListener()).getVuforiaCameraFromTarget();
// if we have a target, process the "pose" to determine the position of the target relative to the robot.
if (targetPose != null)
{
targetFound = true;
targetName = trackable.getName();
VectorF trans = targetPose.getTranslation();
// Extract the X & Y components of the offset of the target relative to the robot
double targetX = trans.get(0) / MM_PER_INCH; // Image X axis
double targetY = trans.get(2) / MM_PER_INCH; // Image Z axis
// Target range is the straight-line distance from the camera to the target (the hypotenuse of the right triangle).
targetRange = Math.hypot(targetX, targetY);
// Target bearing is the angle between the camera's forward direction and the line to the target.
targetBearing = Math.toDegrees(Math.asin(targetX / targetRange));
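// Worked example (hypothetical numbers): a target 10 inches to the right (targetX = 10) and
// 24 inches ahead (targetY = 24) gives targetRange = hypot(10, 24) = 26 inches and
// targetBearing = asin(10 / 26) = about 22.6 degrees to the right.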
break; // jump out of target tracking loop if we find a target.
}
}
}
// Tell the driver what we see, and what to do.
if (targetFound) {
telemetry.addData(">","HOLD Left-Bumper to Drive to Target\n");
telemetry.addData("Target", " %s", targetName);
telemetry.addData("Range", "%5.1f inches", targetRange);
telemetry.addData("Bearing","%3.0f degrees", targetBearing);
} else {
telemetry.addData(">","Drive using joystick to find target\n");
}
// Drive to target Automatically if Left Bumper is being pressed, AND we have found a target.
if (gamepad1.left_bumper && targetFound) {
// Determine heading and range error so we can use them to control the robot automatically.
double rangeError = (targetRange - DESIRED_DISTANCE);
double headingError = targetBearing;
// Use the speed and turn "gains" to calculate how we want the robot to move.
drive = rangeError * SPEED_GAIN;
turn = headingError * TURN_GAIN ;
telemetry.addData("Auto","Drive %5.2f, Turn %5.2f", drive, turn);
} else {
// drive using manual POV Joystick mode.
drive = -gamepad1.left_stick_y / 2.0; // Reduce drive rate to 50%.
turn = gamepad1.right_stick_x / 4.0; // Reduce turn rate to 25%.
telemetry.addData("Manual","Drive %5.2f, Turn %5.2f", drive, turn);
}
telemetry.update();
// Calculate left and right wheel powers and send them to the motors.
double leftPower = Range.clip(drive + turn, -1.0, 1.0) ;
double rightPower = Range.clip(drive - turn, -1.0, 1.0) ;
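// This mixes the drive and turn commands for each wheel; Range.clip keeps each command within
// the legal -1.0 to +1.0 motor power range even when drive + turn exceeds 1.0.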
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
sleep(10);
}
}
}

View File

@ -0,0 +1,284 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
import java.util.ArrayList;
import java.util.List;
import static org.firstinspires.ftc.robotcore.external.navigation.AngleUnit.DEGREES;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XYZ;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.YZX;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesReference.EXTRINSIC;
import static org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer.CameraDirection.BACK;
/**
* This OpMode illustrates using the Vuforia localizer to determine positioning and orientation of
* robot on the FTC field using the RC phone's camera. The code is structured as a LinearOpMode
*
* Note: If you are using a WEBCAM see ConceptVuforiaFieldNavigationWebcam.java
*
* When images are located, Vuforia is able to determine the position and orientation of the
* image relative to the camera. This sample code then combines that information with a
* knowledge of where the target images are on the field, to determine the location of the camera.
*
* Finally, the location of the camera on the robot is used to determine the
* robot's location and orientation on the field.
*
* To learn more about the FTC field coordinate model, see FTC_FieldCoordinateSystemDefinition.pdf in this folder
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Vuforia Field Nav", group ="Concept")
@Disabled
public class ConceptVuforiaFieldNavigation extends LinearOpMode {
// IMPORTANT: For Phone Camera, set 1) the camera source and 2) the orientation, based on how your phone is mounted:
// 1) Camera Source. Valid choices are: BACK (behind screen) or FRONT (selfie side)
// 2) Phone Orientation. Choices are: PHONE_IS_PORTRAIT = true (portrait) or PHONE_IS_PORTRAIT = false (landscape)
private static final VuforiaLocalizer.CameraDirection CAMERA_CHOICE = BACK;
private static final boolean PHONE_IS_PORTRAIT = false ;
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it into your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" -- YOUR NEW VUFORIA KEY GOES HERE --- ";
// Since ImageTarget trackables use mm to specify their dimensions, we must use mm for all the physical dimensions.
// We will define some constants and conversions here. These are useful for the Freight Frenzy field.
private static final float mmPerInch = 25.4f;
private static final float mmTargetHeight = 6 * mmPerInch; // the height of the center of the target image above the floor
private static final float halfField = 72 * mmPerInch;
private static final float halfTile = 12 * mmPerInch;
private static final float oneAndHalfTile = 36 * mmPerInch;
// Class Members
private OpenGLMatrix lastLocation = null;
private VuforiaLocalizer vuforia = null;
private VuforiaTrackables targets = null ;
private boolean targetVisible = false;
private float phoneXRotate = 0;
private float phoneYRotate = 0;
private float phoneZRotate = 0;
@Override public void runOpMode() {
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* To get an on-phone camera preview, use the code below.
* If no camera preview is desired, use the parameter-less constructor instead (commented out below).
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
parameters.cameraDirection = CAMERA_CHOICE;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the data sets for the trackable objects. These particular data
// sets are stored in the 'assets' part of our application.
targets = this.vuforia.loadTrackablesFromAsset("FreightFrenzy");
// For convenience, gather together all the trackable objects in one easily-iterable collection.
List<VuforiaTrackable> allTrackables = new ArrayList<VuforiaTrackable>();
allTrackables.addAll(targets);
/**
* In order for localization to work, we need to tell the system where each target is on the field, and
* where the phone resides on the robot. These specifications are in the form of <em>transformation matrices.</em>
* Transformation matrices are a central, important concept in the math here involved in localization.
* See <a href="https://en.wikipedia.org/wiki/Transformation_matrix">Transformation Matrix</a>
* for detailed information. Commonly, you'll encounter transformation matrices as instances
* of the {@link OpenGLMatrix} class.
*
* If you are standing in the Red Alliance Station looking towards the center of the field,
* - The X axis runs from your left to the right. (positive from the center to the right)
* - The Y axis runs from the Red Alliance Station towards the other side of the field
* where the Blue Alliance Station is. (Positive is from the center, towards the BlueAlliance station)
* - The Z axis runs from the floor, upwards towards the ceiling. (Positive is above the floor)
*
* Before being transformed, each target image is conceptually located at the origin of the field's
* coordinate system (the center of the field), facing up.
*/
// Name and locate each trackable object
identifyTarget(0, "Blue Storage", -halfField, oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(1, "Blue Alliance Wall", halfTile, halfField, mmTargetHeight, 90, 0, 0);
identifyTarget(2, "Red Storage", -halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(3, "Red Alliance Wall", halfTile, -halfField, mmTargetHeight, 90, 0, 180);
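// Worked example: the first call above places "Blue Storage" at x = -halfField (-72 in),
// y = oneAndHalfTile (+36 in), z = mmTargetHeight (6 in above the floor), rotated up from its
// flat, face-up starting pose so the image stands vertically on the field perimeter wall.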
/*
* Create a transformation matrix describing where the phone is on the robot.
*
* NOTE !!!! It's very important that you turn OFF your phone's Auto-Screen-Rotation option.
* Lock it into Portrait for these numbers to work.
*
* Info: The coordinate frame for the robot looks the same as the field.
* The robot's "forward" direction is facing out along X axis, with the LEFT side facing out along the Y axis.
* Z is UP on the robot. This equates to a heading angle of Zero degrees.
*
* The phone starts out lying flat, with the screen facing Up and with the physical top of the phone
* pointing to the LEFT side of the Robot.
* The two examples below assume that the camera is facing forward out the front of the robot.
*/
// We need to rotate the camera around its long axis to bring the correct camera forward.
if (CAMERA_CHOICE == BACK) {
phoneYRotate = -90;
} else {
phoneYRotate = 90;
}
// Rotate the phone vertical about the X axis if it's in portrait mode
if (PHONE_IS_PORTRAIT) {
phoneXRotate = 90 ;
}
// Next, translate the camera lens to where it is on the robot.
// In this example, it is centered on the robot (left-to-right and front-to-back), and 6 inches above ground level.
final float CAMERA_FORWARD_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the forward distance from the center of the robot to the camera lens
final float CAMERA_VERTICAL_DISPLACEMENT = 6.0f * mmPerInch; // eg: Camera is 6 Inches above ground
final float CAMERA_LEFT_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the left distance from the center of the robot to the camera lens
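// Hypothetical example: for a camera mounted 7 inches forward of the robot center and
// 2 inches to the left, at the same height, these constants would become
//   CAMERA_FORWARD_DISPLACEMENT = 7.0f * mmPerInch and CAMERA_LEFT_DISPLACEMENT = 2.0f * mmPerInch.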
OpenGLMatrix robotFromCamera = OpenGLMatrix
.translation(CAMERA_FORWARD_DISPLACEMENT, CAMERA_LEFT_DISPLACEMENT, CAMERA_VERTICAL_DISPLACEMENT)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, YZX, DEGREES, phoneYRotate, phoneZRotate, phoneXRotate));
/** Let all the trackable listeners know where the phone is. */
for (VuforiaTrackable trackable : allTrackables) {
((VuforiaTrackableDefaultListener) trackable.getListener()).setPhoneInformation(robotFromCamera, parameters.cameraDirection);
}
/*
* WARNING:
* In this sample, we do not wait for PLAY to be pressed. Target Tracking is started immediately when INIT is pressed.
* This is done so that the remote DS Camera Stream feature can be used with this sample.
* CONSEQUENTLY do not put any driving commands in this loop.
* To restore the normal opmode structure, just un-comment the following line:
*/
// waitForStart();
/* Note: To use the remote camera preview:
* AFTER you hit Init on the Driver Station, use the "options menu" to select "Camera Stream"
* Tap the preview window to receive a fresh image.
* It is not permitted to transition to RUN while the camera preview window is active.
* Either press STOP to exit the OpMode, or use the "options menu" again, and select "Camera Stream" to close the preview window.
*/
targets.activate();
while (!isStopRequested()) {
// check all the trackable targets to see which one (if any) is visible.
targetVisible = false;
for (VuforiaTrackable trackable : allTrackables) {
if (((VuforiaTrackableDefaultListener)trackable.getListener()).isVisible()) {
telemetry.addData("Visible Target", trackable.getName());
targetVisible = true;
// getUpdatedRobotLocation() will return null if no new information is available since
// the last time that call was made, or if the trackable is not currently visible.
OpenGLMatrix robotLocationTransform = ((VuforiaTrackableDefaultListener)trackable.getListener()).getUpdatedRobotLocation();
if (robotLocationTransform != null) {
lastLocation = robotLocationTransform;
}
break;
}
}
// Provide feedback as to where the robot is located (if we know).
if (targetVisible) {
// express position (translation) of robot in inches.
VectorF translation = lastLocation.getTranslation();
telemetry.addData("Pos (inches)", "{X, Y, Z} = %.1f, %.1f, %.1f",
translation.get(0) / mmPerInch, translation.get(1) / mmPerInch, translation.get(2) / mmPerInch);
// express the rotation of the robot in degrees.
Orientation rotation = Orientation.getOrientation(lastLocation, EXTRINSIC, XYZ, DEGREES);
telemetry.addData("Rot (deg)", "{Roll, Pitch, Heading} = %.0f, %.0f, %.0f", rotation.firstAngle, rotation.secondAngle, rotation.thirdAngle);
}
else {
telemetry.addData("Visible Target", "none");
}
telemetry.update();
}
// Disable Tracking when we are done;
targets.deactivate();
}
/**
* Identify a target by naming it, and setting its position and orientation on the field
* @param targetIndex index of the target within the loaded trackables set
* @param targetName  name to assign to the target
* @param dx, dy, dz  Target offsets in the x,y,z axes (mm)
* @param rx, ry, rz  Target rotations about the x,y,z axes (degrees)
*/
void identifyTarget(int targetIndex, String targetName, float dx, float dy, float dz, float rx, float ry, float rz) {
VuforiaTrackable aTarget = targets.get(targetIndex);
aTarget.setName(targetName);
aTarget.setLocation(OpenGLMatrix.translation(dx, dy, dz)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XYZ, DEGREES, rx, ry, rz)));
}
}

View File

@ -0,0 +1,274 @@
/* Copyright (c) 2019 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.external.matrices.OpenGLMatrix;
import org.firstinspires.ftc.robotcore.external.matrices.VectorF;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaLocalizer;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackable;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackableDefaultListener;
import org.firstinspires.ftc.robotcore.external.navigation.VuforiaTrackables;
import java.util.ArrayList;
import java.util.List;
import static org.firstinspires.ftc.robotcore.external.navigation.AngleUnit.DEGREES;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XYZ;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesOrder.XZY;
import static org.firstinspires.ftc.robotcore.external.navigation.AxesReference.EXTRINSIC;
/**
* This OpMode illustrates using the Vuforia localizer to determine positioning and orientation of
* the robot on the FTC field using a WEBCAM. The code is structured as a LinearOpMode.
*
* NOTE: If you are running on a Phone with a built-in camera, use the ConceptVuforiaFieldNavigation example instead of this one.
* NOTE: It is possible to switch between multiple WebCams (eg: one for the left side and one for the right).
* For a related example of how to do this, see ConceptTensorFlowObjectDetectionSwitchableCameras
*
* When images are located, Vuforia is able to determine the position and orientation of the
* image relative to the camera. This sample code then combines that information with a
* knowledge of where the target images are on the field, to determine the location of the camera.
*
* Finally, the location of the camera on the robot is used to determine the
* robot's location and orientation on the field.
*
* To learn more about the FTC field coordinate model, see FTC_FieldCoordinateSystemDefinition.pdf in this folder
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*
* IMPORTANT: In order to use this OpMode, you need to obtain your own Vuforia license key as
* is explained below.
*/
@TeleOp(name="Vuforia Field Nav Webcam", group ="Concept")
@Disabled
public class ConceptVuforiaFieldNavigationWebcam extends LinearOpMode {
/*
* IMPORTANT: You need to obtain your own license key to use Vuforia. The string below with which
* 'parameters.vuforiaLicenseKey' is initialized is for illustration only, and will not function.
* A Vuforia 'Development' license key, can be obtained free of charge from the Vuforia developer
* web site at https://developer.vuforia.com/license-manager.
*
* Vuforia license keys are always 380 characters long, and look as if they contain mostly
* random data. As an example, here is a fragment of a valid key:
* ... yIgIzTqZ4mWjk9wd3cZO9T1axEqzuhxoGlfOOI2dRzKS4T0hQ8kT ...
* Once you've obtained a license key, copy the string from the Vuforia web site
* and paste it in to your code on the next line, between the double quotes.
*/
private static final String VUFORIA_KEY =
" --- YOUR NEW VUFORIA KEY GOES HERE --- ";
// Since ImageTarget trackables use mm to specify their dimensions, we must use mm for all the physical dimensions.
// We will define some constants and conversions here
private static final float mmPerInch = 25.4f;
private static final float mmTargetHeight = 6 * mmPerInch; // the height of the center of the target image above the floor
private static final float halfField = 72 * mmPerInch;
private static final float halfTile = 12 * mmPerInch;
private static final float oneAndHalfTile = 36 * mmPerInch;
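// For reference (assuming the standard 12 ft x 12 ft FTC field built from 24 inch tiles):
// halfField = 72 in ~= 1828.8 mm, halfTile = 12 in ~= 304.8 mm, oneAndHalfTile = 36 in ~= 914.4 mm.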
// Class Members
private OpenGLMatrix lastLocation = null;
private VuforiaLocalizer vuforia = null;
private VuforiaTrackables targets = null ;
private WebcamName webcamName = null;
private boolean targetVisible = false;
@Override public void runOpMode() {
// Connect to the camera we are to use. This name must match what is set up in Robot Configuration
webcamName = hardwareMap.get(WebcamName.class, "Webcam 1");
/*
* Configure Vuforia by creating a Parameter object, and passing it to the Vuforia engine.
* We can pass Vuforia the handle to a camera preview resource (on the RC screen);
* If no camera-preview is desired, use the parameter-less constructor instead (commented out below).
* Note: A preview window is required if you want to view the camera stream on the Driver Station Phone.
*/
int cameraMonitorViewId = hardwareMap.appContext.getResources().getIdentifier("cameraMonitorViewId", "id", hardwareMap.appContext.getPackageName());
VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters(cameraMonitorViewId);
// VuforiaLocalizer.Parameters parameters = new VuforiaLocalizer.Parameters();
parameters.vuforiaLicenseKey = VUFORIA_KEY;
// We also indicate which camera we wish to use.
parameters.cameraName = webcamName;
// Turn off Extended tracking. Set this true if you want Vuforia to track beyond the target.
parameters.useExtendedTracking = false;
// Instantiate the Vuforia engine
vuforia = ClassFactory.getInstance().createVuforia(parameters);
// Load the data sets for the trackable objects. These particular data
// sets are stored in the 'assets' part of our application.
targets = this.vuforia.loadTrackablesFromAsset("FreightFrenzy");
// For convenience, gather together all the trackable objects in one easily-iterable collection
List<VuforiaTrackable> allTrackables = new ArrayList<VuforiaTrackable>();
allTrackables.addAll(targets);
/**
* In order for localization to work, we need to tell the system where each target is on the field, and
* where the phone resides on the robot. These specifications are in the form of <em>transformation matrices.</em>
* Transformation matrices are a central, important concept in the math here involved in localization.
* See <a href="https://en.wikipedia.org/wiki/Transformation_matrix">Transformation Matrix</a>
* for detailed information. Commonly, you'll encounter transformation matrices as instances
* of the {@link OpenGLMatrix} class.
*
* If you are standing in the Red Alliance Station looking towards the center of the field,
* - The X axis runs from your left to the right. (positive from the center to the right)
* - The Y axis runs from the Red Alliance Station towards the other side of the field
* where the Blue Alliance Station is. (Positive is from the center, towards the BlueAlliance station)
* - The Z axis runs from the floor, upwards towards the ceiling. (Positive is above the floor)
*
* Before being transformed, each target image is conceptually located at the origin of the field's
* coordinate system (the center of the field), facing up.
*/
// Name and locate each trackable object
identifyTarget(0, "Blue Storage", -halfField, oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(1, "Blue Alliance Wall", halfTile, halfField, mmTargetHeight, 90, 0, 0);
identifyTarget(2, "Red Storage", -halfField, -oneAndHalfTile, mmTargetHeight, 90, 0, 90);
identifyTarget(3, "Red Alliance Wall", halfTile, -halfField, mmTargetHeight, 90, 0, 180);
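// For example, the first call above places the "Blue Storage" image on the wall at X = -halfField,
// 1.5 tiles toward the Blue Alliance side (Y = oneAndHalfTile), with its center mmTargetHeight above the
// floor; roughly speaking, the two 90 degree rotations stand the image upright and face it into the field.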
/*
* Create a transformation matrix describing where the camera is on the robot.
*
* Info: The coordinate frame for the robot looks the same as the field.
* The robot's "forward" direction is facing out along X axis, with the LEFT side facing out along the Y axis.
* Z is UP on the robot. This equates to a bearing angle of Zero degrees.
*
* For a WebCam, the default starting orientation of the camera is looking UP (pointing in the Z direction),
* with the wide (horizontal) axis of the camera aligned with the X axis, and
* the Narrow (vertical) axis of the camera aligned with the Y axis
*
* But, this example assumes that the camera is actually facing forward out the front of the robot.
* So, the "default" camera position requires two rotations to get it oriented correctly.
* 1) First it must be rotated +90 degrees around the X axis to get it horizontal (it's now facing out the right side of the robot)
* 2) Next it must be rotated +90 degrees (counter-clockwise) around the Z axis to face forward.
*
* Finally the camera can be translated to its actual mounting position on the robot.
* In this example, it is centered on the robot (left-to-right and front-to-back), and 6 inches above ground level.
*/
final float CAMERA_FORWARD_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the forward distance from the center of the robot to the camera lens
final float CAMERA_VERTICAL_DISPLACEMENT = 6.0f * mmPerInch; // eg: Camera is 6 Inches above ground
final float CAMERA_LEFT_DISPLACEMENT = 0.0f * mmPerInch; // eg: Enter the left distance from the center of the robot to the camera lens
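// For instance, if your camera were mounted 7 inches forward of robot center and 2 inches to the left
// (hypothetical values), you would use 7.0f * mmPerInch and 2.0f * mmPerInch above; the rotation below
// stays the same as long as the camera still points straight ahead.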
OpenGLMatrix cameraLocationOnRobot = OpenGLMatrix
.translation(CAMERA_FORWARD_DISPLACEMENT, CAMERA_LEFT_DISPLACEMENT, CAMERA_VERTICAL_DISPLACEMENT)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XZY, DEGREES, 90, 90, 0));
/** Let all the trackable listeners know where the camera is. */
for (VuforiaTrackable trackable : allTrackables) {
((VuforiaTrackableDefaultListener) trackable.getListener()).setCameraLocationOnRobot(parameters.cameraName, cameraLocationOnRobot);
}
/*
* WARNING:
* In this sample, we do not wait for PLAY to be pressed. Target Tracking is started immediately when INIT is pressed.
* This sequence is used to enable the new remote DS Camera Preview feature to be used with this sample.
* CONSEQUENTLY do not put any driving commands in this loop.
* To restore the normal opmode structure, just un-comment the following line:
*/
// waitForStart();
/* Note: To use the remote camera preview:
* AFTER you hit Init on the Driver Station, use the "options menu" to select "Camera Stream"
* Tap the preview window to receive a fresh image.
* It is not permitted to transition to RUN while the camera preview window is active.
* Either press STOP to exit the OpMode, or use the "options menu" again, and select "Camera Stream" to close the preview window.
*/
targets.activate();
while (!isStopRequested()) {
// check all the trackable targets to see which one (if any) is visible.
targetVisible = false;
for (VuforiaTrackable trackable : allTrackables) {
if (((VuforiaTrackableDefaultListener)trackable.getListener()).isVisible()) {
telemetry.addData("Visible Target", trackable.getName());
targetVisible = true;
// getUpdatedRobotLocation() will return null if no new information is available since
// the last time that call was made, or if the trackable is not currently visible.
OpenGLMatrix robotLocationTransform = ((VuforiaTrackableDefaultListener)trackable.getListener()).getUpdatedRobotLocation();
if (robotLocationTransform != null) {
lastLocation = robotLocationTransform;
}
break;
}
}
// Provide feedback as to where the robot is located (if we know).
if (targetVisible) {
// express position (translation) of robot in inches.
VectorF translation = lastLocation.getTranslation();
telemetry.addData("Pos (inches)", "{X, Y, Z} = %.1f, %.1f, %.1f",
translation.get(0) / mmPerInch, translation.get(1) / mmPerInch, translation.get(2) / mmPerInch);
// express the rotation of the robot in degrees.
Orientation rotation = Orientation.getOrientation(lastLocation, EXTRINSIC, XYZ, DEGREES);
telemetry.addData("Rot (deg)", "{Roll, Pitch, Heading} = %.0f, %.0f, %.0f", rotation.firstAngle, rotation.secondAngle, rotation.thirdAngle);
}
else {
telemetry.addData("Visible Target", "none");
}
telemetry.update();
}
// Disable Tracking when we are done;
targets.deactivate();
}
/***
* Identify a target by naming it, and setting its position and orientation on the field
* @param targetIndex index of the target within the loaded trackables collection
* @param targetName  name to assign to the target (reported in telemetry when visible)
* @param dx, dy, dz Target offsets in x,y,z axes
* @param rx, ry, rz Target rotations in x,y,z axes
*/
void identifyTarget(int targetIndex, String targetName, float dx, float dy, float dz, float rx, float ry, float rz) {
VuforiaTrackable aTarget = targets.get(targetIndex);
aTarget.setName(targetName);
aTarget.setLocation(OpenGLMatrix.translation(dx, dy, dz)
.multiplied(Orientation.getRotationMatrix(EXTRINSIC, XYZ, DEGREES, rx, ry, rz)));
}
}

View File

@ -0,0 +1,310 @@
/* Copyright (c) 2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.graphics.Bitmap;
import android.graphics.ImageFormat;
import android.os.Handler;
import androidx.annotation.NonNull;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.RobotLog;
import org.firstinspires.ftc.robotcore.external.ClassFactory;
import org.firstinspires.ftc.robotcore.external.android.util.Size;
import org.firstinspires.ftc.robotcore.external.function.Consumer;
import org.firstinspires.ftc.robotcore.external.function.Continuation;
import org.firstinspires.ftc.robotcore.external.hardware.camera.Camera;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraCaptureRequest;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraCaptureSequenceId;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraCaptureSession;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraCharacteristics;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraException;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraFrame;
import org.firstinspires.ftc.robotcore.external.hardware.camera.CameraManager;
import org.firstinspires.ftc.robotcore.external.hardware.camera.WebcamName;
import org.firstinspires.ftc.robotcore.internal.collections.EvictingBlockingQueue;
import org.firstinspires.ftc.robotcore.internal.network.CallbackLooper;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import org.firstinspires.ftc.robotcore.internal.system.ContinuationSynchronizer;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.Locale;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;
/**
* This OpMode illustrates how to open a webcam and retrieve images from it. It requires a configuration
* containing a webcam with the default name ("Webcam 1"). When the opmode runs, pressing the 'A' button
* will cause a frame from the camera to be written to a file on the device, which can then be retrieved
* by various means (e.g.: Device File Explorer in Android Studio; plugging the device into a PC and
* using Media Transfer; ADB; etc)
*/
@TeleOp(name="Concept: Webcam", group ="Concept")
@Disabled
public class ConceptWebcam extends LinearOpMode {
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
private static final String TAG = "Webcam Sample";
/** How long we are to wait to be granted permission to use the camera before giving up. Here,
* we wait indefinitely */
private static final int secondsPermissionTimeout = Integer.MAX_VALUE;
/** State regarding our interaction with the camera */
private CameraManager cameraManager;
private WebcamName cameraName;
private Camera camera;
private CameraCaptureSession cameraCaptureSession;
/** The queue into which all frames from the camera are placed as they become available.
* Frames which are not processed by the OpMode are automatically discarded. */
private EvictingBlockingQueue<Bitmap> frameQueue;
/** State regarding where and how to save frames when the 'A' button is pressed. */
private int captureCounter = 0;
private File captureDirectory = AppUtil.ROBOT_DATA_DIR;
/** A utility object that indicates where the asynchronous callbacks from the camera
* infrastructure are to run. In this OpMode, that's all hidden from you (but see {@link #startCamera}
* if you're curious): no knowledge of multi-threading is needed here. */
private Handler callbackHandler;
//----------------------------------------------------------------------------------------------
// Main OpMode entry
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
callbackHandler = CallbackLooper.getDefault().getHandler();
cameraManager = ClassFactory.getInstance().getCameraManager();
cameraName = hardwareMap.get(WebcamName.class, "Webcam 1");
initializeFrameQueue(2);
AppUtil.getInstance().ensureDirectoryExists(captureDirectory);
try {
openCamera();
if (camera == null) return;
startCamera();
if (cameraCaptureSession == null) return;
telemetry.addData(">", "Press Play to start");
telemetry.update();
waitForStart();
telemetry.clear();
telemetry.addData(">", "Started...Press 'A' to capture frame");
boolean buttonPressSeen = false;
boolean captureWhenAvailable = false;
while (opModeIsActive()) {
boolean buttonIsPressed = gamepad1.a;
if (buttonIsPressed && !buttonPressSeen) {
captureWhenAvailable = true;
}
buttonPressSeen = buttonIsPressed;
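// Note: remembering the previous button state gives simple rising-edge detection, so holding 'A'
// down requests only one capture per press rather than one capture per loop iteration.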
if (captureWhenAvailable) {
Bitmap bmp = frameQueue.poll();
if (bmp != null) {
captureWhenAvailable = false;
onNewFrame(bmp);
}
}
telemetry.update();
}
} finally {
closeCamera();
}
}
/** Do something with the frame */
private void onNewFrame(Bitmap frame) {
saveBitmap(frame);
frame.recycle(); // not strictly necessary, but helpful
}
//----------------------------------------------------------------------------------------------
// Camera operations
//----------------------------------------------------------------------------------------------
private void initializeFrameQueue(int capacity) {
/** The frame queue will automatically throw away bitmap frames if they are not processed
* quickly by the OpMode. This avoids a buildup of frames in memory */
frameQueue = new EvictingBlockingQueue<Bitmap>(new ArrayBlockingQueue<Bitmap>(capacity));
frameQueue.setEvictAction(new Consumer<Bitmap>() {
@Override public void accept(Bitmap frame) {
// RobotLog.ii(TAG, "frame recycled w/o processing");
frame.recycle(); // not strictly necessary, but helpful
}
});
}
private void openCamera() {
if (camera != null) return; // be idempotent
Deadline deadline = new Deadline(secondsPermissionTimeout, TimeUnit.SECONDS);
camera = cameraManager.requestPermissionAndOpenCamera(deadline, cameraName, null);
if (camera == null) {
error("camera not found or permission to use not granted: %s", cameraName);
}
}
private void startCamera() {
if (cameraCaptureSession != null) return; // be idempotent
/** YUY2 is supported by all Webcams, per the USB Webcam standard: See "USB Device Class Definition
* for Video Devices: Uncompressed Payload, Table 2-1". Further, often this is the *only*
* image format supported by a camera */
final int imageFormat = ImageFormat.YUY2;
/** Verify that the image is supported, and fetch size and desired frame rate if so */
CameraCharacteristics cameraCharacteristics = cameraName.getCameraCharacteristics();
if (!contains(cameraCharacteristics.getAndroidFormats(), imageFormat)) {
error("image format not supported");
return;
}
final Size size = cameraCharacteristics.getDefaultSize(imageFormat);
final int fps = cameraCharacteristics.getMaxFramesPerSecond(imageFormat, size);
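// As an illustration only: a typical USB webcam might report something like 640x480 at 30 fps for
// YUY2 here, but the actual size and frame rate come from the camera's own characteristics.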
/** Some of the logic below runs asynchronously on other threads. Use of the synchronizer
* here allows us to wait in this method until all that asynchrony completes before returning. */
final ContinuationSynchronizer<CameraCaptureSession> synchronizer = new ContinuationSynchronizer<>();
try {
/** Create a session in which requests to capture frames can be made */
camera.createCaptureSession(Continuation.create(callbackHandler, new CameraCaptureSession.StateCallbackDefault() {
@Override public void onConfigured(@NonNull CameraCaptureSession session) {
try {
/** The session is ready to go. Start requesting frames */
final CameraCaptureRequest captureRequest = camera.createCaptureRequest(imageFormat, size, fps);
session.startCapture(captureRequest,
new CameraCaptureSession.CaptureCallback() {
@Override public void onNewFrame(@NonNull CameraCaptureSession session, @NonNull CameraCaptureRequest request, @NonNull CameraFrame cameraFrame) {
/** A new frame is available. The frame data has <em>not</em> been copied for us, and we can only access it
* for the duration of the callback. So we copy here manually. */
Bitmap bmp = captureRequest.createEmptyBitmap();
cameraFrame.copyToBitmap(bmp);
frameQueue.offer(bmp);
}
},
Continuation.create(callbackHandler, new CameraCaptureSession.StatusCallback() {
@Override public void onCaptureSequenceCompleted(@NonNull CameraCaptureSession session, CameraCaptureSequenceId cameraCaptureSequenceId, long lastFrameNumber) {
RobotLog.ii(TAG, "capture sequence %s reports completed: lastFrame=%d", cameraCaptureSequenceId, lastFrameNumber);
}
})
);
synchronizer.finish(session);
} catch (CameraException|RuntimeException e) {
RobotLog.ee(TAG, e, "exception starting capture");
error("exception starting capture");
session.close();
synchronizer.finish(null);
}
}
}));
} catch (CameraException|RuntimeException e) {
RobotLog.ee(TAG, e, "exception starting camera");
error("exception starting camera");
synchronizer.finish(null);
}
/** Wait for all the asynchrony to complete */
try {
synchronizer.await();
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
/** Retrieve the created session. This will be null on error. */
cameraCaptureSession = synchronizer.getValue();
}
private void stopCamera() {
if (cameraCaptureSession != null) {
cameraCaptureSession.stopCapture();
cameraCaptureSession.close();
cameraCaptureSession = null;
}
}
private void closeCamera() {
stopCamera();
if (camera != null) {
camera.close();
camera = null;
}
}
//----------------------------------------------------------------------------------------------
// Utilities
//----------------------------------------------------------------------------------------------
private void error(String msg) {
telemetry.log().add(msg);
telemetry.update();
}
private void error(String format, Object...args) {
telemetry.log().add(format, args);
telemetry.update();
}
private boolean contains(int[] array, int value) {
for (int i : array) {
if (i == value) return true;
}
return false;
}
private void saveBitmap(Bitmap bitmap) {
File file = new File(captureDirectory, String.format(Locale.getDefault(), "webcam-frame-%d.jpg", captureCounter++));
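// Successive captures are written as webcam-frame-0.jpg, webcam-frame-1.jpg, ... in the robot data directory.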
try {
try (FileOutputStream outputStream = new FileOutputStream(file)) {
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, outputStream);
telemetry.log().add("captured %s", file.getName());
}
} catch (IOException e) {
RobotLog.ee(TAG, e, "exception in saveBitmap()");
error("exception saving %s", file.getName());
}
}
}

View File

@ -0,0 +1,105 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.HardwareMap;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This is NOT an opmode.
*
* This class can be used to define all the specific hardware for a single robot.
* In this case that robot is a Pushbot.
* See PushbotTeleopTank_Iterative and others classes starting with "Pushbot" for usage examples.
*
* This hardware class assumes the following device names have been configured on the robot:
* Note: All names are lower case and use underscores between words.
*
* Motor channel: Left drive motor: "left_drive"
* Motor channel: Right drive motor: "right_drive"
* Motor channel: Manipulator drive motor: "left_arm"
* Servo channel: Servo to open left claw: "left_hand"
* Servo channel: Servo to open right claw: "right_hand"
*/
public class HardwarePushbot
{
/* Public OpMode members. */
public DcMotor leftDrive = null;
public DcMotor rightDrive = null;
public DcMotor leftArm = null;
public Servo leftClaw = null;
public Servo rightClaw = null;
public static final double MID_SERVO = 0.5 ;
public static final double ARM_UP_POWER = 0.45 ;
public static final double ARM_DOWN_POWER = -0.45 ;
/* local OpMode members. */
HardwareMap hwMap = null;
private ElapsedTime period = new ElapsedTime();
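/* Typical usage from an OpMode (sketch only):
 *   HardwarePushbot robot = new HardwarePushbot();
 *   robot.init(hardwareMap);          // map all devices using the names listed above
 *   robot.leftDrive.setPower(0.5);    // then command the mapped hardware directly
 */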
/* Constructor */
public HardwarePushbot(){
}
/* Initialize standard Hardware interfaces */
public void init(HardwareMap ahwMap) {
// Save reference to Hardware map
hwMap = ahwMap;
// Define and Initialize Motors
leftDrive = hwMap.get(DcMotor.class, "left_drive");
rightDrive = hwMap.get(DcMotor.class, "right_drive");
leftArm = hwMap.get(DcMotor.class, "left_arm");
leftDrive.setDirection(DcMotor.Direction.FORWARD); // Set to REVERSE if using AndyMark motors
rightDrive.setDirection(DcMotor.Direction.REVERSE);// Set to FORWARD if using AndyMark motors
// Set all motors to zero power
leftDrive.setPower(0);
rightDrive.setPower(0);
leftArm.setPower(0);
// Set all motors to run without encoders.
// May want to use RUN_USING_ENCODERS if encoders are installed.
leftDrive.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
rightDrive.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
leftArm.setMode(DcMotor.RunMode.RUN_WITHOUT_ENCODER);
// Define and initialize ALL installed servos.
leftClaw = hwMap.get(Servo.class, "left_hand");
rightClaw = hwMap.get(Servo.class, "right_hand");
leftClaw.setPosition(MID_SERVO);
rightClaw.setPosition(MID_SERVO);
}
}

View File

@ -0,0 +1,184 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of driving a path based on encoder counts.
* It uses the common Pushbot hardware class to define the drive on the robot.
* The code is structured as a LinearOpMode
*
* The code REQUIRES that you DO have encoders on the wheels,
* otherwise you would use: PushbotAutoDriveByTime;
*
* This code ALSO requires that the drive Motors have been configured such that a positive
* power command moves them forwards, and causes the encoders to count UP.
*
* The desired path in this example is:
* - Drive forward for 48 inches
* - Spin right for 12 Inches
* - Drive Backwards for 24 inches
* - Stop and close the claw.
*
* The code is written using a method called: encoderDrive(speed, leftInches, rightInches, timeoutS)
* that performs the actual movement.
* This method assumes that each movement is relative to the last stopping place.
* There are other ways to perform encoder based moves, but this method is probably the simplest.
* This code uses the RUN_TO_POSITION mode to enable the Motor controllers to generate the run profile
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@Autonomous(name="Pushbot: Auto Drive By Encoder", group="Pushbot")
@Disabled
public class PushbotAutoDriveByEncoder_Linear extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
private ElapsedTime runtime = new ElapsedTime();
static final double COUNTS_PER_MOTOR_REV = 1440 ; // eg: TETRIX Motor Encoder
static final double DRIVE_GEAR_REDUCTION = 2.0 ; // This is < 1.0 if geared UP
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
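// Worked example using the values above: (1440 * 2.0) / (4.0 * 3.1415) ~= 229.2 counts per inch,
// so the 48 inch first leg below corresponds to roughly 11,000 encoder counts on each wheel.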
static final double DRIVE_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
/*
* Initialize the drive system variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Resetting Encoders"); //
telemetry.update();
robot.leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
robot.leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Send telemetry message to indicate successful Encoder reset
telemetry.addData("Path0", "Starting at %7d :%7d",
robot.leftDrive.getCurrentPosition(),
robot.rightDrive.getCurrentPosition());
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path,
// Note: Reverse movement is obtained by setting a negative distance (not speed)
encoderDrive(DRIVE_SPEED, 48, 48, 5.0); // S1: Forward 48 Inches with 5 Sec timeout
encoderDrive(TURN_SPEED, 12, -12, 4.0); // S2: Turn Right 12 Inches with 4 Sec timeout
encoderDrive(DRIVE_SPEED, -24, -24, 4.0); // S3: Reverse 24 Inches with 4 Sec timeout
robot.leftClaw.setPosition(1.0); // S4: Stop and close the claw.
robot.rightClaw.setPosition(0.0);
sleep(1000); // pause for servos to move
telemetry.addData("Path", "Complete");
telemetry.update();
}
/*
* Method to perform a relative move, based on encoder counts.
* Encoders are not reset as the move is based on the current position.
* Move will stop if any of three conditions occur:
* 1) Move gets to the desired position
* 2) Move runs out of time
* 3) Driver stops the opmode running.
*/
public void encoderDrive(double speed,
double leftInches, double rightInches,
double timeoutS) {
int newLeftTarget;
int newRightTarget;
// Ensure that the opmode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
newLeftTarget = robot.leftDrive.getCurrentPosition() + (int)(leftInches * COUNTS_PER_INCH);
newRightTarget = robot.rightDrive.getCurrentPosition() + (int)(rightInches * COUNTS_PER_INCH);
robot.leftDrive.setTargetPosition(newLeftTarget);
robot.rightDrive.setTargetPosition(newRightTarget);
// Turn On RUN_TO_POSITION
robot.leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// reset the timeout time and start motion.
runtime.reset();
robot.leftDrive.setPower(Math.abs(speed));
robot.rightDrive.setPower(Math.abs(speed));
// keep looping while we are still active, and there is time left, and both motors are running.
// Note: We use (isBusy() && isBusy()) in the loop test, which means that when EITHER motor hits
// its target position, the motion will stop. This is "safer" in that the robot will
// always end the motion as soon as possible.
// However, if you require that BOTH motors have finished their moves before the robot continues
// onto the next step, use (isBusy() || isBusy()) in the loop test.
while (opModeIsActive() &&
(runtime.seconds() < timeoutS) &&
(robot.leftDrive.isBusy() && robot.rightDrive.isBusy())) {
// Display it for the driver.
telemetry.addData("Path1", "Running to %7d :%7d", newLeftTarget, newRightTarget);
telemetry.addData("Path2", "Running at %7d :%7d",
robot.leftDrive.getCurrentPosition(),
robot.rightDrive.getCurrentPosition());
telemetry.update();
}
// Stop all motion;
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
// Turn off RUN_TO_POSITION
robot.leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// sleep(250); // optional pause after each move
}
}
}

View File

@ -0,0 +1,363 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.util.ElapsedTime;
import com.qualcomm.robotcore.util.Range;
/**
* This file illustrates the concept of driving a path based on Gyro heading and encoder counts.
* It uses the common Pushbot hardware class to define the drive on the robot.
* The code is structured as a LinearOpMode
*
* The code REQUIRES that you DO have encoders on the wheels,
* otherwise you would use: PushbotAutoDriveByTime;
*
* This code ALSO requires that you have a Modern Robotics I2C gyro with the name "gyro"
* otherwise you would use: PushbotAutoDriveByEncoder;
*
* This code requires that the drive Motors have been configured such that a positive
* power command moves them forward, and causes the encoders to count UP.
*
* This code uses the RUN_TO_POSITION mode to enable the Motor controllers to generate the run profile
*
* In order to calibrate the Gyro correctly, the robot must remain stationary during calibration.
* This is performed when the INIT button is pressed on the Driver Station.
* This code assumes that the robot is stationary when the INIT button is pressed.
* If this is not the case, then the INIT should be performed again.
*
* Note: in this example, all angles are referenced to the initial coordinate frame set during
* the Gyro Calibration process, or whenever the program issues a resetZAxisIntegrator() call on the Gyro.
*
* The angle of movement/rotation is assumed to be a standardized rotation around the robot Z axis,
* which means that a Positive rotation is Counter Clock Wise, looking down on the field.
* This is consistent with the FTC field coordinate conventions set out in the document:
* ftc_app\doc\tutorial\FTC_FieldCoordinateSystemDefinition.pdf
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@Autonomous(name="Pushbot: Auto Drive By Gyro", group="Pushbot")
@Disabled
public class PushbotAutoDriveByGyro_Linear extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
ModernRoboticsI2cGyro gyro = null; // Additional Gyro device
static final double COUNTS_PER_MOTOR_REV = 1440 ; // eg: TETRIX Motor Encoder
static final double DRIVE_GEAR_REDUCTION = 2.0 ; // This is < 1.0 if geared UP
static final double WHEEL_DIAMETER_INCHES = 4.0 ; // For figuring circumference
static final double COUNTS_PER_INCH = (COUNTS_PER_MOTOR_REV * DRIVE_GEAR_REDUCTION) /
(WHEEL_DIAMETER_INCHES * 3.1415);
// These constants define the desired driving/control characteristics
// These can/should be tweaked to suit the specific robot drive train.
static final double DRIVE_SPEED = 0.7; // Nominal speed for better accuracy.
static final double TURN_SPEED = 0.5; // Nominal half speed for better accuracy.
static final double HEADING_THRESHOLD = 1 ; // As tight as we can make it with an integer gyro
static final double P_TURN_COEFF = 0.1; // Larger is more responsive, but also less stable
static final double P_DRIVE_COEFF = 0.15; // Larger is more responsive, but also less stable
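// Worked example: a 2 degree heading error with P_DRIVE_COEFF = 0.15 gives steer = 0.3, so at
// DRIVE_SPEED = 0.7 the wheel powers become roughly 0.4 and 1.0, turning the robot back on course.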
@Override
public void runOpMode() {
/*
* Initialize the standard drive system variables.
* The init() method of the hardware class does most of the work here
*/
robot.init(hardwareMap);
gyro = (ModernRoboticsI2cGyro)hardwareMap.gyroSensor.get("gyro");
// Ensure the robot is stationary, then reset the encoders and calibrate the gyro.
robot.leftDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.STOP_AND_RESET_ENCODER);
// Send telemetry message to alert driver that we are calibrating;
telemetry.addData(">", "Calibrating Gyro"); //
telemetry.update();
gyro.calibrate();
// make sure the gyro is calibrated before continuing
while (!isStopRequested() && gyro.isCalibrating()) {
sleep(50);
idle();
}
telemetry.addData(">", "Robot Ready."); //
telemetry.update();
robot.leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// Wait for the game to start (Display Gyro value), and reset gyro before we move.
while (!isStarted()) {
telemetry.addData(">", "Robot Heading = %d", gyro.getIntegratedZValue());
telemetry.update();
}
gyro.resetZAxisIntegrator();
// Step through each leg of the path,
// Note: Reverse movement is obtained by setting a negative distance (not speed)
// Put a hold after each turn
gyroDrive(DRIVE_SPEED, 48.0, 0.0); // Drive FWD 48 inches
gyroTurn( TURN_SPEED, -45.0); // Turn CW to -45 Degrees
gyroHold( TURN_SPEED, -45.0, 0.5); // Hold -45 Deg heading for a 1/2 second
gyroDrive(DRIVE_SPEED, 12.0, -45.0); // Drive FWD 12 inches at -45 degrees
gyroTurn( TURN_SPEED, 45.0); // Turn CCW to 45 Degrees
gyroHold( TURN_SPEED, 45.0, 0.5); // Hold 45 Deg heading for a 1/2 second
gyroTurn( TURN_SPEED, 0.0); // Turn CW to 0 Degrees
gyroHold( TURN_SPEED, 0.0, 1.0); // Hold 0 Deg heading for a 1 second
gyroDrive(DRIVE_SPEED,-48.0, 0.0); // Drive REV 48 inches
telemetry.addData("Path", "Complete");
telemetry.update();
}
/**
* Method to drive on a fixed compass bearing (angle), based on encoder counts.
* Move will stop if either of these conditions occur:
* 1) Move gets to the desired position
* 2) Driver stops the opmode running.
*
* @param speed Target speed for forward motion. Should allow for +/- variance for adjusting heading
* @param distance Distance (in inches) to move from current position. Negative distance means move backwards.
* @param angle Absolute Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
*/
public void gyroDrive ( double speed,
double distance,
double angle) {
int newLeftTarget;
int newRightTarget;
int moveCounts;
double max;
double error;
double steer;
double leftSpeed;
double rightSpeed;
// Ensure that the opmode is still active
if (opModeIsActive()) {
// Determine new target position, and pass to motor controller
moveCounts = (int)(distance * COUNTS_PER_INCH);
newLeftTarget = robot.leftDrive.getCurrentPosition() + moveCounts;
newRightTarget = robot.rightDrive.getCurrentPosition() + moveCounts;
// Set Target and Turn On RUN_TO_POSITION
robot.leftDrive.setTargetPosition(newLeftTarget);
robot.rightDrive.setTargetPosition(newRightTarget);
robot.leftDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_TO_POSITION);
// start motion.
speed = Range.clip(Math.abs(speed), 0.0, 1.0);
robot.leftDrive.setPower(speed);
robot.rightDrive.setPower(speed);
// keep looping while we are still active, and BOTH motors are running.
while (opModeIsActive() &&
(robot.leftDrive.isBusy() && robot.rightDrive.isBusy())) {
// adjust relative speed based on heading error.
error = getError(angle);
steer = getSteer(error, P_DRIVE_COEFF);
// if driving in reverse, the motor correction also needs to be reversed
if (distance < 0)
steer *= -1.0;
leftSpeed = speed - steer;
rightSpeed = speed + steer;
// Normalize speeds if either one exceeds +/- 1.0;
max = Math.max(Math.abs(leftSpeed), Math.abs(rightSpeed));
if (max > 1.0)
{
leftSpeed /= max;
rightSpeed /= max;
}
robot.leftDrive.setPower(leftSpeed);
robot.rightDrive.setPower(rightSpeed);
// Display drive status for the driver.
telemetry.addData("Err/St", "%5.1f/%5.1f", error, steer);
telemetry.addData("Target", "%7d:%7d", newLeftTarget, newRightTarget);
telemetry.addData("Actual", "%7d:%7d", robot.leftDrive.getCurrentPosition(),
robot.rightDrive.getCurrentPosition());
telemetry.addData("Speed", "%5.2f:%5.2f", leftSpeed, rightSpeed);
telemetry.update();
}
// Stop all motion;
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
// Turn off RUN_TO_POSITION
robot.leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
robot.rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
}
}
/**
* Method to spin on central axis to point in a new direction.
* Move will stop if either of these conditions occur:
* 1) Move gets to the heading (angle)
* 2) Driver stops the opmode running.
*
* @param speed Desired speed of turn.
* @param angle Absolute Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
*/
public void gyroTurn ( double speed, double angle) {
// keep looping while we are still active, and not on heading.
while (opModeIsActive() && !onHeading(speed, angle, P_TURN_COEFF)) {
// Update telemetry & Allow time for other processes to run.
telemetry.update();
}
}
/**
* Method to obtain & hold a heading for a finite amount of time
* Move will stop once the requested time has elapsed
*
* @param speed Desired speed of turn.
* @param angle Absolute Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
* @param holdTime Length of time (in seconds) to hold the specified heading.
*/
public void gyroHold( double speed, double angle, double holdTime) {
ElapsedTime holdTimer = new ElapsedTime();
// keep looping while we have time remaining.
holdTimer.reset();
while (opModeIsActive() && (holdTimer.time() < holdTime)) {
// Update telemetry & Allow time for other processes to run.
onHeading(speed, angle, P_TURN_COEFF);
telemetry.update();
}
// Stop all motion;
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
}
/**
* Perform one cycle of closed loop heading control.
*
* @param speed Desired speed of turn.
* @param angle Absolute Angle (in Degrees) relative to last gyro reset.
* 0 = fwd. +ve is CCW from fwd. -ve is CW from forward.
* If a relative angle is required, add/subtract from current heading.
* @param PCoeff Proportional Gain coefficient
* @return true once the robot heading is within HEADING_THRESHOLD of the requested angle
*/
boolean onHeading(double speed, double angle, double PCoeff) {
double error ;
double steer ;
boolean onTarget = false ;
double leftSpeed;
double rightSpeed;
// determine turn power based on +/- error
error = getError(angle);
if (Math.abs(error) <= HEADING_THRESHOLD) {
steer = 0.0;
leftSpeed = 0.0;
rightSpeed = 0.0;
onTarget = true;
}
else {
steer = getSteer(error, PCoeff);
rightSpeed = speed * steer;
leftSpeed = -rightSpeed;
}
// Send desired speeds to motors.
robot.leftDrive.setPower(leftSpeed);
robot.rightDrive.setPower(rightSpeed);
// Display it for the driver.
telemetry.addData("Target", "%5.2f", angle);
telemetry.addData("Err/St", "%5.2f/%5.2f", error, steer);
telemetry.addData("Speed.", "%5.2f:%5.2f", leftSpeed, rightSpeed);
return onTarget;
}
/**
* getError determines the error between the target angle and the robot's current heading
* @param targetAngle Desired angle (relative to global reference established at last Gyro Reset).
* @return error angle: Degrees in the range +/- 180. Centered on the robot's frame of reference
* +ve error means the robot should turn LEFT (CCW) to reduce error.
*/
public double getError(double targetAngle) {
double robotError;
// calculate error in -179 to +180 range
robotError = targetAngle - gyro.getIntegratedZValue();
while (robotError > 180) robotError -= 360;
while (robotError <= -180) robotError += 360;
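// e.g. target +170 with heading -170 gives a raw error of 340, which wraps to -20 (a 20 degree CW turn).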
return robotError;
}
/**
* returns desired steering force. +/- 1 range. +ve = steer left
* @param error Error angle in robot relative degrees
* @param PCoeff Proportional Gain Coefficient
* @return the proportional steering value, clipped to the range -1 to +1
*/
public double getSteer(double error, double PCoeff) {
return Range.clip(error * PCoeff, -1, 1);
}
}

View File

@ -0,0 +1,125 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.util.ElapsedTime;
/**
* This file illustrates the concept of driving a path based on time.
* It uses the common Pushbot hardware class to define the drive on the robot.
* The code is structured as a LinearOpMode
*
* The code assumes that you do NOT have encoders on the wheels,
* otherwise you would use: PushbotAutoDriveByEncoder;
*
* The desired path in this example is:
* - Drive forward for 3 seconds
* - Spin right for 1.3 seconds
* - Drive Backwards for 1 Second
* - Stop and close the claw.
*
* The code is written in a simple form with no optimizations.
* However, there are several ways that this type of sequence could be streamlined.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@Autonomous(name="Pushbot: Auto Drive By Time", group="Pushbot")
@Disabled
public class PushbotAutoDriveByTime_Linear extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
private ElapsedTime runtime = new ElapsedTime();
static final double FORWARD_SPEED = 0.6;
static final double TURN_SPEED = 0.5;
@Override
public void runOpMode() {
/*
* Initialize the drive system variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to run"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// Step through each leg of the path, ensuring that the Auto mode has not been stopped along the way
// Step 1: Drive forward for 3 seconds
robot.leftDrive.setPower(FORWARD_SPEED);
robot.rightDrive.setPower(FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 3.0)) {
telemetry.addData("Path", "Leg 1: %2.5f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 2: Spin right for 1.3 seconds
robot.leftDrive.setPower(TURN_SPEED);
robot.rightDrive.setPower(-TURN_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.3)) {
telemetry.addData("Path", "Leg 2: %2.5f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 3: Drive Backwards for 1 Second
robot.leftDrive.setPower(-FORWARD_SPEED);
robot.rightDrive.setPower(-FORWARD_SPEED);
runtime.reset();
while (opModeIsActive() && (runtime.seconds() < 1.0)) {
telemetry.addData("Path", "Leg 3: %2.5f S Elapsed", runtime.seconds());
telemetry.update();
}
// Step 4: Stop and close the claw.
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
robot.leftClaw.setPosition(1.0);
robot.rightClaw.setPosition(0.0);
telemetry.addData("Path", "Complete");
telemetry.update();
sleep(1000);
}
}

View File

@ -0,0 +1,118 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Autonomous;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.LightSensor;
/**
* This file illustrates the concept of driving up to a line and then stopping.
* It uses the common Pushbot hardware class to define the drive on the robot.
* The code is structured as a LinearOpMode
*
* The code shows using two different light sensors:
* The Primary sensor shown in this code is a legacy NXT Light sensor (called "sensor_light")
* Alternative "commented out" code uses an MR Optical Distance Sensor (called "sensor_ods")
* instead of the LEGO sensor. Choose to use one sensor or the other.
*
* Setting the correct WHITE_THRESHOLD value is key to stopping correctly.
* This should be set halfway between the light and dark values.
* These values can be read on the screen once the OpMode has been INIT, but before it is STARTED.
* Move the sensor on and off the white line and note the min and max readings.
* Edit this code to make WHITE_THRESHOLD halfway between the min and max.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@Autonomous(name="Pushbot: Auto Drive To Line", group="Pushbot")
@Disabled
public class PushbotAutoDriveToLine_Linear extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
LightSensor lightSensor; // Primary LEGO Light sensor,
// OpticalDistanceSensor lightSensor; // Alternative MR ODS sensor
static final double WHITE_THRESHOLD = 0.2; // spans between 0.1 - 0.5 from dark to light
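// Illustrative worked example of choosing the threshold (your readings will differ; measure them
// as described in the header comments above):
//   double darkReading  = 0.10;   // light level reported off the line
//   double lightReading = 0.50;   // light level reported on the white tape
//   double threshold    = (darkReading + lightReading) / 2.0;   // = 0.30, the midpoint value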
static final double APPROACH_SPEED = 0.5;
@Override
public void runOpMode() {
/* Initialize the drive system variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// If there are encoders connected, switch to RUN_USING_ENCODER mode for greater accuracy
// robot.leftDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// robot.rightDrive.setMode(DcMotor.RunMode.RUN_USING_ENCODER);
// get a reference to our Light Sensor object.
lightSensor = hardwareMap.lightSensor.get("sensor_light"); // Primary LEGO Light Sensor
// lightSensor = hardwareMap.opticalDistanceSensor.get("sensor_ods"); // Alternative MR ODS sensor.
// turn on LED of light sensor.
lightSensor.enableLed(true);
// Send telemetry message to signify robot waiting;
telemetry.addData("Status", "Ready to run"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
// Abort this loop if the OpMode is started or stopped.
while (!(isStarted() || isStopRequested())) {
// Display the light level while we are waiting to start
telemetry.addData("Light Level", lightSensor.getLightDetected());
telemetry.update();
idle();
}
// Start the robot moving forward, and then begin looking for a white line.
robot.leftDrive.setPower(APPROACH_SPEED);
robot.rightDrive.setPower(APPROACH_SPEED);
// run until the white line is seen OR the driver presses STOP;
while (opModeIsActive() && (lightSensor.getLightDetected() < WHITE_THRESHOLD)) {
// Display the light level while we are looking for the line
telemetry.addData("Light Level", lightSensor.getLightDetected());
telemetry.update();
}
// Stop all motors
robot.leftDrive.setPower(0);
robot.rightDrive.setPower(0);
}
}

View File

@ -0,0 +1,134 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
/**
* This OpMode uses the common Pushbot hardware class to define the devices on the robot.
* All device access is managed through the HardwarePushbot class.
* The code is structured as a LinearOpMode
*
* This particular OpMode executes a POV Game style Teleop for a PushBot
* In this mode the left stick moves the robot FWD and back, the Right stick turns left and right.
* It raises and lowers the claw using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Pushbot: Teleop POV", group="Pushbot")
@Disabled
public class PushbotTeleopPOV_Linear extends LinearOpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // Use a Pushbot's hardware
double clawOffset = 0; // Servo mid position
final double CLAW_SPEED = 0.02 ; // sets rate to move servo
@Override
public void runOpMode() {
double left;
double right;
double drive;
double turn;
double max;
/* Initialize the hardware variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// Send telemetry message to signify robot waiting;
telemetry.addData("Say", "Hello Driver"); //
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
// Run wheels in POV mode (note: The joystick goes negative when pushed forwards, so negate it)
// In this mode the Left stick moves the robot fwd and back, the Right stick turns left and right.
// This way it's also easy to just drive straight, or just turn.
drive = -gamepad1.left_stick_y;
turn = gamepad1.right_stick_x;
// Combine drive and turn for blended motion.
left = drive + turn;
right = drive - turn;
// Normalize the values so neither exceed +/- 1.0
max = Math.max(Math.abs(left), Math.abs(right));
if (max > 1.0)
{
left /= max;
right /= max;
}
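// Worked example (illustrative numbers): drive = 0.8, turn = 0.5 gives left = 1.3, right = 0.3;
// max = 1.3 exceeds 1.0, so both are divided by 1.3, leaving left = 1.0 and right of about 0.23,
// which keeps the left/right ratio (and thus the intended turn) unchanged.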
// Output the safe values to the motor drives.
robot.leftDrive.setPower(left);
robot.rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
robot.leftClaw.setPosition(robot.MID_SERVO + clawOffset);
robot.rightClaw.setPosition(robot.MID_SERVO - clawOffset);
// Use gamepad buttons to move arm up (Y) and down (A)
if (gamepad1.y)
robot.leftArm.setPower(robot.ARM_UP_POWER);
else if (gamepad1.a)
robot.leftArm.setPower(robot.ARM_DOWN_POWER);
else
robot.leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
telemetry.update();
// Pace this loop so the claw action runs at a reasonable speed.
sleep(50);
}
}
}

View File

@ -0,0 +1,135 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.Range;
/**
* This file provides basic Teleop driving for a Pushbot robot.
* The code is structured as an Iterative OpMode
*
* This OpMode uses the common Pushbot hardware class to define the devices on the robot.
* All device access is managed through the HardwarePushbot class.
*
* This particular OpMode executes a basic Tank Drive Teleop for a PushBot
* It raises and lowers the claw using the Gamepad Y and A buttons respectively.
* It also opens and closes the claws slowly using the left and right Bumper buttons.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name="Pushbot: Teleop Tank", group="Pushbot")
@Disabled
public class PushbotTeleopTank_Iterative extends OpMode {
/* Declare OpMode members. */
HardwarePushbot robot = new HardwarePushbot(); // use the class created to define a Pushbot's hardware
double clawOffset = 0.0 ; // Servo mid position
final double CLAW_SPEED = 0.02 ; // sets rate to move servo
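// (Illustrative, assuming loop() runs roughly every 50 ms, as the LinearOpMode sample above paces
// itself) 0.02 per call sweeps the full -0.5..0.5 offset range in about 50 calls, roughly 2.5 seconds.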
/*
* Code to run ONCE when the driver hits INIT
*/
@Override
public void init() {
/* Initialize the hardware variables.
* The init() method of the hardware class does all the work here
*/
robot.init(hardwareMap);
// Send telemetry message to signify robot waiting;
telemetry.addData("Say", "Hello Driver"); //
}
/*
* Code to run REPEATEDLY after the driver hits INIT, but before they hit PLAY
*/
@Override
public void init_loop() {
}
/*
* Code to run ONCE when the driver hits PLAY
*/
@Override
public void start() {
}
/*
* Code to run REPEATEDLY after the driver hits PLAY but before they hit STOP
*/
@Override
public void loop() {
double left;
double right;
// Run wheels in tank mode (note: The joystick goes negative when pushed forwards, so negate it)
left = -gamepad1.left_stick_y;
right = -gamepad1.right_stick_y;
robot.leftDrive.setPower(left);
robot.rightDrive.setPower(right);
// Use gamepad left & right Bumpers to open and close the claw
if (gamepad1.right_bumper)
clawOffset += CLAW_SPEED;
else if (gamepad1.left_bumper)
clawOffset -= CLAW_SPEED;
// Move both servos to new position. Assume servos are mirror image of each other.
clawOffset = Range.clip(clawOffset, -0.5, 0.5);
robot.leftClaw.setPosition(robot.MID_SERVO + clawOffset);
robot.rightClaw.setPosition(robot.MID_SERVO - clawOffset);
// Use gamepad buttons to move the arm up (Y) and down (A)
if (gamepad1.y)
robot.leftArm.setPower(robot.ARM_UP_POWER);
else if (gamepad1.a)
robot.leftArm.setPower(robot.ARM_DOWN_POWER);
else
robot.leftArm.setPower(0.0);
// Send telemetry message to signify robot running;
telemetry.addData("claw", "Offset = %.2f", clawOffset);
telemetry.addData("left", "%.2f", left);
telemetry.addData("right", "%.2f", right);
}
/*
* Code to run ONCE after the driver hits STOP
*/
@Override
public void stop() {
}
}

View File

@ -0,0 +1,164 @@
/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.RevBlinkinLedDriver;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Telemetry;
import org.firstinspires.ftc.robotcore.internal.system.Deadline;
import java.util.concurrent.TimeUnit;
/*
* Display patterns of a REV Robotics Blinkin LED Driver.
* AUTO mode cycles through all of the patterns.
* MANUAL mode allows the user to manually change patterns using the
* left and right bumpers of a gamepad.
*
* Configure the driver on a servo port, and name it "blinkin".
*
* Displays the first pattern upon init.
*/
@TeleOp(name="BlinkinExample")
@Disabled
public class SampleRevBlinkinLedDriver extends OpMode {
/*
* Change the pattern every 10 seconds in AUTO mode.
*/
private final static int LED_PERIOD = 10;
/*
* Rate limit gamepad button presses to every 500ms.
*/
private final static int GAMEPAD_LOCKOUT = 500;
RevBlinkinLedDriver blinkinLedDriver;
RevBlinkinLedDriver.BlinkinPattern pattern;
Telemetry.Item patternName;
Telemetry.Item display;
DisplayKind displayKind;
Deadline ledCycleDeadline;
Deadline gamepadRateLimit;
protected enum DisplayKind {
MANUAL,
AUTO
}
@Override
public void init()
{
displayKind = DisplayKind.AUTO;
blinkinLedDriver = hardwareMap.get(RevBlinkinLedDriver.class, "blinkin");
pattern = RevBlinkinLedDriver.BlinkinPattern.RAINBOW_RAINBOW_PALETTE;
blinkinLedDriver.setPattern(pattern);
display = telemetry.addData("Display Kind: ", displayKind.toString());
patternName = telemetry.addData("Pattern: ", pattern.toString());
ledCycleDeadline = new Deadline(LED_PERIOD, TimeUnit.SECONDS);
gamepadRateLimit = new Deadline(GAMEPAD_LOCKOUT, TimeUnit.MILLISECONDS);
}
@Override
public void loop()
{
handleGamepad();
if (displayKind == DisplayKind.AUTO) {
doAutoDisplay();
} else {
/*
* MANUAL mode: Nothing to do here; the pattern is set in response to gamepad events in handleGamepad().
*/
}
}
/*
* handleGamepad
*
* Responds to a gamepad button press. Demonstrates rate limiting for
* button presses. If loop() is called every 10ms and you don't rate
* limit, then any given button press may register as multiple button presses,
* which in this application is problematic.
*
* A: Manual mode, Right bumper displays the next pattern, left bumper displays the previous pattern.
* B: Auto mode, pattern cycles, changing every LED_PERIOD seconds.
*/
protected void handleGamepad()
{
if (!gamepadRateLimit.hasExpired()) {
return;
}
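// (Illustrative) With GAMEPAD_LOCKOUT = 500 ms, a held bumper advances the pattern at most
// twice per second instead of once per loop() call.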
if (gamepad1.a) {
setDisplayKind(DisplayKind.MANUAL);
gamepadRateLimit.reset();
} else if (gamepad1.b) {
setDisplayKind(DisplayKind.AUTO);
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.left_bumper)) {
pattern = pattern.previous();
displayPattern();
gamepadRateLimit.reset();
} else if ((displayKind == DisplayKind.MANUAL) && (gamepad1.right_bumper)) {
pattern = pattern.next();
displayPattern();
gamepadRateLimit.reset();
}
}
protected void setDisplayKind(DisplayKind displayKind)
{
this.displayKind = displayKind;
display.setValue(displayKind.toString());
}
protected void doAutoDisplay()
{
if (ledCycleDeadline.hasExpired()) {
pattern = pattern.next();
displayPattern();
ledCycleDeadline.reset();
}
}
protected void displayPattern()
{
blinkinLedDriver.setPattern(pattern);
patternName.setValue(pattern.toString());
}
}

View File

@ -0,0 +1,167 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.ColorSensor;
import com.qualcomm.robotcore.hardware.DeviceInterfaceModule;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
*
* This is an example LinearOpMode that shows how to use
* the Adafruit RGB Sensor. It assumes that the I2C
* cable for the sensor is connected to an I2C port on the
* Core Device Interface Module.
*
* It also assumes that the LED pin of the sensor is connected
* to the digital signal pin of a digital port on the
* Core Device Interface Module.
*
* You can use the digital port to turn the sensor's onboard
* LED on or off.
*
* The op mode assumes that the Core Device Interface Module
* is configured with a name of "dim" and that the Adafruit color sensor
* is configured as an I2C device with a name of "sensor_color".
*
* It also assumes that the LED pin of the RGB sensor
* is connected to the signal pin of digital port #5 (zero indexed)
* of the Core Device Interface Module.
*
* You can use the X button on gamepad1 to toggle the LED on and off.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: AdafruitRGB", group = "Sensor")
@Disabled // Comment this out to add to the opmode list
public class SensorAdafruitRGB extends LinearOpMode {
ColorSensor sensorRGB;
DeviceInterfaceModule cdim;
// we assume that the LED pin of the RGB sensor is connected to
// digital port 5 (zero indexed).
static final int LED_CHANNEL = 5;
@Override
public void runOpMode() {
// hsvValues is an array that will hold the hue, saturation, and value information.
float hsvValues[] = {0F,0F,0F};
// values is a reference to the hsvValues array.
final float values[] = hsvValues;
// get a reference to the RelativeLayout so we can change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
final View relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
// bPrevState and bCurrState represent the previous and current state of the button.
boolean bPrevState = false;
boolean bCurrState = false;
// bLedOn represents the state of the LED.
boolean bLedOn = true;
// get a reference to our DeviceInterfaceModule object.
cdim = hardwareMap.deviceInterfaceModule.get("dim");
// set the digital channel to output mode.
// remember, the Adafruit sensor is actually two devices.
// It's an I2C sensor and it's also an LED that can be turned on or off.
cdim.setDigitalChannelMode(LED_CHANNEL, DigitalChannel.Mode.OUTPUT);
// get a reference to our ColorSensor object.
sensorRGB = hardwareMap.colorSensor.get("sensor_color");
// turn the LED on in the beginning, just so the user will know that the sensor is active.
cdim.setDigitalChannelState(LED_CHANNEL, bLedOn);
// wait for the start button to be pressed.
waitForStart();
// loop and read the RGB data.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// check the status of the x button on gamepad.
bCurrState = gamepad1.x;
// check for button-press state transitions.
if ((bCurrState == true) && (bCurrState != bPrevState)) {
// button is transitioning to a pressed state. Toggle the LED.
bLedOn = !bLedOn;
cdim.setDigitalChannelState(LED_CHANNEL, bLedOn);
}
// update previous state variable.
bPrevState = bCurrState;
// convert the RGB values to HSV values.
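// (Assumption for illustration) the raw channel values are treated here as spanning roughly 0-800,
// so multiplying by 255 and dividing by 800 rescales them into the 0-255 range Color.RGBToHSV expects.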
Color.RGBToHSV((sensorRGB.red() * 255) / 800, (sensorRGB.green() * 255) / 800, (sensorRGB.blue() * 255) / 800, hsvValues);
// send the info back to driver station using telemetry function.
telemetry.addData("LED", bLedOn ? "On" : "Off");
telemetry.addData("Clear", sensorRGB.alpha());
telemetry.addData("Red ", sensorRGB.red());
telemetry.addData("Green", sensorRGB.green());
telemetry.addData("Blue ", sensorRGB.blue());
telemetry.addData("Hue", hsvValues[0]);
// change the background color to match the color detected by the RGB sensor.
// pass a reference to the hue, saturation, and value array as an argument
// to the HSVToColor method.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(0xff, values));
}
});
telemetry.update();
}
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}

View File

@ -0,0 +1,184 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.hardware.bosch.JustLoggingAccelerationIntegrator;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.external.navigation.Position;
import org.firstinspires.ftc.robotcore.external.navigation.Velocity;
import java.util.Locale;
/**
* {@link SensorBNO055IMU} gives a short demo on how to use the BNO055 Inertial Motion Unit (IMU) from AdaFruit.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* @see <a href="http://www.adafruit.com/products/2472">Adafruit IMU</a>
*/
@TeleOp(name = "Sensor: BNO055 IMU", group = "Sensor")
@Disabled // Comment this out to add to the opmode list
public class SensorBNO055IMU extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// The IMU sensor object
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
Acceleration gravity;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
// Set up the parameters with which we will use our IMU. Note that integration
// algorithm here just reports accelerations to the logcat log; it doesn't actually
// provide positional information.
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.angleUnit = BNO055IMU.AngleUnit.DEGREES;
parameters.accelUnit = BNO055IMU.AccelUnit.METERS_PERSEC_PERSEC;
parameters.calibrationDataFile = "BNO055IMUCalibration.json"; // see the calibration sample opmode
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
parameters.accelerationIntegrationAlgorithm = new JustLoggingAccelerationIntegrator();
// Retrieve and initialize the IMU. We expect the IMU to be attached to an I2C port
// on a Core Device Interface Module, configured to be a sensor of type "AdaFruit IMU",
// and named "imu".
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
// Set up our telemetry dashboard
composeTelemetry();
// Wait until we're told to go
waitForStart();
// Start the logging of measured acceleration
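// (Assumption for illustration) the third argument is the polling interval in milliseconds,
// so acceleration samples are integrated roughly once per second here.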
imu.startAccelerationIntegration(new Position(), new Velocity(), 1000);
// Loop and update the dashboard
while (opModeIsActive()) {
telemetry.update();
}
}
//----------------------------------------------------------------------------------------------
// Telemetry Configuration
//----------------------------------------------------------------------------------------------
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
gravity = imu.getGravity();
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
telemetry.addLine()
.addData("grvty", new Func<String>() {
@Override public String value() {
return gravity.toString();
}
})
.addData("mag", new Func<String>() {
@Override public String value() {
return String.format(Locale.getDefault(), "%.3f",
Math.sqrt(gravity.xAccel*gravity.xAccel
+ gravity.yAccel*gravity.yAccel
+ gravity.zAccel*gravity.zAccel));
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,232 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.adafruit.AdafruitBNO055IMU;
import com.qualcomm.hardware.bosch.BNO055IMU;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.util.ReadWriteFile;
import org.firstinspires.ftc.robotcore.external.Func;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import java.io.File;
import java.util.Locale;
/**
* {@link SensorBNO055IMUCalibration} calibrates the IMU accelerometer per
* "Section 3.11 Calibration" of the BNO055 specification.
*
* <p>Manual calibration of the IMU is definitely NOT necessary: except for the magnetometer
* (which is not used by the default {@link BNO055IMU.SensorMode#IMU
* SensorMode#IMU}), the BNO055 is internally self-calibrating and thus can be very successfully
* used without manual intervention. That said, performing a one-time calibration, saving the
* results persistently, then loading them again at each run can help reduce the time that automatic
* calibration requires.</p>
*
* <p>This summary of the calibration process, from <a href="http://iotdk.intel.com/docs/master/upm/classupm_1_1_b_n_o055.html">
* Intel</a>, is informative:</p>
*
* <p>"This device requires calibration in order to operate accurately. [...] Calibration data is
* lost on a power cycle. See one of the examples for a description of how to calibrate the device,
* but in essence:</p>
*
* <p>There is a calibration status register available [...] that returns the calibration status
* of the accelerometer (ACC), magnetometer (MAG), gyroscope (GYR), and overall system (SYS).
* Each of these values range from 0 (uncalibrated) to 3 (fully calibrated). Calibration [ideally]
* involves certain motions to get all 4 values at 3. The motions are as follows (though see the
* datasheet for more information):</p>
*
* <ol>
*     <li>GYR: Simply let the sensor sit flat for a few seconds.</li>
*     <li>ACC: Move the sensor in various positions. Start flat, then rotate slowly by 45
*          degrees, hold for a few seconds, then continue rotating another 45 degrees and
*          hold, etc. 6 or more movements of this type may be required. You can move through
*          any axis you desire, but make sure that the device is lying at least once
*          perpendicular to the x, y, and z axis.</li>
*     <li>MAG: Move slowly in a figure 8 pattern in the air, until the calibration value reaches 3.</li>
*     <li>SYS: This will usually reach 3 when the other items have also reached 3. If not, continue
*          slowly moving the device through various axes until it does."</li>
* </ol>
*
* <p>To calibrate the IMU, run this sample opmode with a gamepad attached to the driver station.
* Once the IMU has reached sufficient calibration as reported on telemetry, press the 'A'
* button on the gamepad to write the calibration to a file. That file can then be indicated
* later when running an opmode which uses the IMU.</p>
*
* <p>Note: if your intended uses of the IMU do not include use of all its sensors (for example,
* you might not use the magnetometer), then it makes little sense for you to wait for full
* calibration of the sensors you are not using before saving the calibration data. Indeed,
* it appears that in a SensorMode that doesn't use the magnetometer (for example), the
* magnetometer cannot actually be calibrated.</p>
*
* @see AdafruitBNO055IMU
* @see BNO055IMU.Parameters#calibrationDataFile
* @see <a href="https://www.bosch-sensortec.com/bst/products/all_products/bno055">BNO055 product page</a>
* @see <a href="https://ae-bst.resource.bosch.com/media/_tech/media/datasheets/BST_BNO055_DS000_14.pdf">BNO055 specification</a>
*/
@TeleOp(name = "Sensor: BNO055 IMU Calibration", group = "Sensor")
@Disabled // Comment this out to add to the opmode list
public class SensorBNO055IMUCalibration extends LinearOpMode
{
//----------------------------------------------------------------------------------------------
// State
//----------------------------------------------------------------------------------------------
// Our sensors, motors, and other devices go here, along with other long term state
BNO055IMU imu;
// State used for updating telemetry
Orientation angles;
//----------------------------------------------------------------------------------------------
// Main logic
//----------------------------------------------------------------------------------------------
@Override public void runOpMode() {
telemetry.log().setCapacity(12);
telemetry.log().add("");
telemetry.log().add("Please refer to the calibration instructions");
telemetry.log().add("contained in the Adafruit IMU calibration");
telemetry.log().add("sample opmode.");
telemetry.log().add("");
telemetry.log().add("When sufficient calibration has been reached,");
telemetry.log().add("press the 'A' button to write the current");
telemetry.log().add("calibration data to a file.");
telemetry.log().add("");
// We are expecting the IMU to be attached to an I2C port on a Core Device Interface Module and named "imu".
BNO055IMU.Parameters parameters = new BNO055IMU.Parameters();
parameters.loggingEnabled = true;
parameters.loggingTag = "IMU";
imu = hardwareMap.get(BNO055IMU.class, "imu");
imu.initialize(parameters);
composeTelemetry();
telemetry.log().add("Waiting for start...");
// Wait until we're told to go
while (!isStarted()) {
telemetry.update();
idle();
}
telemetry.log().add("...started...");
while (opModeIsActive()) {
if (gamepad1.a) {
// Get the calibration data
BNO055IMU.CalibrationData calibrationData = imu.readCalibrationData();
// Save the calibration data to a file. You can choose whatever file
// name you wish here, but you'll want to indicate the same file name
// when you initialize the IMU in an opmode in which it is used. If you
// have more than one IMU on your robot, you'll of course want to use
// different configuration file names for each.
String filename = "AdafruitIMUCalibration.json";
File file = AppUtil.getInstance().getSettingsFile(filename);
ReadWriteFile.writeFile(file, calibrationData.serialize());
telemetry.log().add("saved to '%s'", filename);
// Wait for the button to be released
while (gamepad1.a) {
telemetry.update();
idle();
}
}
telemetry.update();
}
}
void composeTelemetry() {
// At the beginning of each telemetry update, grab a bunch of data
// from the IMU that we will then display in separate lines.
telemetry.addAction(new Runnable() { @Override public void run()
{
// Acquiring the angles is relatively expensive; we don't want
// to do that in each of the three items that need that info, as that's
// three times the necessary expense.
angles = imu.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
}
});
telemetry.addLine()
.addData("status", new Func<String>() {
@Override public String value() {
return imu.getSystemStatus().toShortString();
}
})
.addData("calib", new Func<String>() {
@Override public String value() {
return imu.getCalibrationStatus().toString();
}
});
telemetry.addLine()
.addData("heading", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.firstAngle);
}
})
.addData("roll", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.secondAngle);
}
})
.addData("pitch", new Func<String>() {
@Override public String value() {
return formatAngle(angles.angleUnit, angles.thirdAngle);
}
});
}
//----------------------------------------------------------------------------------------------
// Formatting
//----------------------------------------------------------------------------------------------
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format(Locale.getDefault(), "%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,223 @@
/* Copyright (c) 2017-2020 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
import com.qualcomm.robotcore.hardware.NormalizedRGBA;
import com.qualcomm.robotcore.hardware.SwitchableLight;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* This is an example LinearOpMode that shows how to use a color sensor in a generic
* way, regardless of which particular make or model of color sensor is used. The Op Mode
* assumes that the color sensor is configured with a name of "sensor_color".
*
* There will be some variation in the values measured depending on the specific sensor you are using.
*
* You can increase the gain (a multiplier to make the sensor report higher values) by holding down
* the A button on the gamepad, and decrease the gain by holding down the B button on the gamepad.
*
* If the color sensor has a light which is controllable from software, you can use the X button on
* the gamepad to toggle the light on and off. The REV sensors don't support this, but instead have
* a physical switch on them to turn the light on and off, beginning with REV Color Sensor V2.
*
* If the color sensor also supports short-range distance measurements (usually via an infrared
* proximity sensor), the reported distance will be written to telemetry. As of September 2020,
* the only color sensors that support this are the ones from REV Robotics. These infrared proximity
* sensor measurements are only useful at very small distances, and are sensitive to ambient light
* and surface reflectivity. You should use a different sensor if you need precise distance measurements.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this Op Mode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: Color", group = "Sensor")
@Disabled
public class SensorColor extends LinearOpMode {
/** The colorSensor field will contain a reference to our color sensor hardware object */
NormalizedColorSensor colorSensor;
/** The relativeLayout field is used to aid in providing interesting visual feedback
* in this sample application; you probably *don't* need this when you use a color sensor on your
* robot. Note that you won't see anything change on the Driver Station, only on the Robot Controller. */
View relativeLayout;
/**
* The runOpMode() method is the root of this Op Mode, as it is in all LinearOpModes.
* Our implementation here, though, is a bit unusual: we've decided to put all the actual work
* in the runSample() method rather than directly in runOpMode() itself. The reason we do that is
* that in this sample we're changing the background color of the robot controller screen as the
* Op Mode runs, and we want to be able to *guarantee* that we restore it to something reasonable
* and palatable when the Op Mode ends. The simplest way to do that is to use a try...finally
* block around the main, core logic, and an easy way to make that all clear was to separate
* the former from the latter in separate methods.
*/
@Override public void runOpMode() {
// Get a reference to the RelativeLayout so we can later change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
try {
runSample(); // actually execute the sample
} finally {
// On the way out, *guarantee* that the background is reasonable. It doesn't actually start off
// as pure white, but it's too much work to dig out what actually was used, and this is good
// enough to at least make the screen reasonable again.
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}
protected void runSample() {
// You can give the sensor a gain value, which will be multiplied by the sensor's raw value before the
// normalized color values are calculated. Color sensors (especially the REV Color Sensor V3)
// can give very low values (depending on the lighting conditions), which only use a small part
// of the 0-1 range that is available for the red, green, and blue values. In brighter conditions,
// you should use a smaller gain than in dark conditions. If your gain is too high, all of the
// colors will report at or near 1, and you won't be able to determine what color you are
// actually looking at. For this reason, it's better to err on the side of a lower gain
// (but always greater than or equal to 1).
float gain = 2;
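// (Illustrative guidance, not a measured rule) brighter environments generally work with a gain close
// to 1, while darker ones need a larger value; tune it with the A/B buttons below while watching telemetry.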
// Once per loop, we will update this hsvValues array. The first element (0) will contain the
// hue, the second element (1) will contain the saturation, and the third element (2) will
// contain the value. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
// for an explanation of HSV color.
final float[] hsvValues = new float[3];
// xButtonPreviouslyPressed and xButtonCurrentlyPressed keep track of the previous and current
// state of the X button on the gamepad
boolean xButtonPreviouslyPressed = false;
boolean xButtonCurrentlyPressed = false;
// Get a reference to our sensor object. It's recommended to use NormalizedColorSensor over
// ColorSensor, because NormalizedColorSensor consistently gives values between 0 and 1, while
// the values you get from ColorSensor are dependent on the specific sensor you're using.
colorSensor = hardwareMap.get(NormalizedColorSensor.class, "sensor_color");
// If possible, turn the light on in the beginning (it might already be on anyway,
// we just make sure it is if we can).
if (colorSensor instanceof SwitchableLight) {
((SwitchableLight)colorSensor).enableLight(true);
}
// Wait for the start button to be pressed.
waitForStart();
// Loop until we are asked to stop
while (opModeIsActive()) {
// Explain basic gain information via telemetry
telemetry.addLine("Hold the A button on gamepad 1 to increase gain, or B to decrease it.\n");
telemetry.addLine("Higher gain values mean that the sensor will report larger numbers for Red, Green, and Blue, and Value\n");
// Update the gain value if either of the A or B gamepad buttons is being held
if (gamepad1.a) {
// Only increase the gain by a small amount, since this loop will occur multiple times per second.
gain += 0.005;
} else if (gamepad1.b && gain > 1) { // A gain of less than 1 will make the values smaller, which is not helpful.
gain -= 0.005;
}
// Show the gain value via telemetry
telemetry.addData("Gain", gain);
// Tell the sensor our desired gain value (normally you would do this during initialization,
// not during the loop)
colorSensor.setGain(gain);
// Check the status of the X button on the gamepad
xButtonCurrentlyPressed = gamepad1.x;
// If the button state is different than what it was, then act
if (xButtonCurrentlyPressed != xButtonPreviouslyPressed) {
// If the button is (now) down, then toggle the light
if (xButtonCurrentlyPressed) {
if (colorSensor instanceof SwitchableLight) {
SwitchableLight light = (SwitchableLight)colorSensor;
light.enableLight(!light.isLightOn());
}
}
}
xButtonPreviouslyPressed = xButtonCurrentlyPressed;
// Get the normalized colors from the sensor
NormalizedRGBA colors = colorSensor.getNormalizedColors();
/* Use telemetry to display feedback on the driver station. We show the red, green, and blue
* normalized values from the sensor (in the range of 0 to 1), as well as the equivalent
* HSV (hue, saturation and value) values. See http://web.archive.org/web/20190311170843/https://infohost.nmt.edu/tcc/help/pubs/colortheory/web/hsv.html
* for an explanation of HSV color. */
// Update the hsvValues array by passing it to Color.colorToHSV()
Color.colorToHSV(colors.toColor(), hsvValues);
telemetry.addLine()
.addData("Red", "%.3f", colors.red)
.addData("Green", "%.3f", colors.green)
.addData("Blue", "%.3f", colors.blue);
telemetry.addLine()
.addData("Hue", "%.3f", hsvValues[0])
.addData("Saturation", "%.3f", hsvValues[1])
.addData("Value", "%.3f", hsvValues[2]);
telemetry.addData("Alpha", "%.3f", colors.alpha);
/* If this color sensor also has a distance sensor, display the measured distance.
* Note that the reported distance is only useful at very close range, and is impacted by
* ambient light and surface reflectivity. */
if (colorSensor instanceof DistanceSensor) {
telemetry.addData("Distance (cm)", "%.3f", ((DistanceSensor) colorSensor).getDistance(DistanceUnit.CM));
}
telemetry.update();
// Change the Robot Controller's background color to match the color detected by the color sensor.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(hsvValues));
}
});
}
}
}

View File

@ -0,0 +1,107 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DeviceInterfaceModule;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This is an example LinearOpMode that shows how to use the digital inputs and outputs on the
* Modern Robotics Device Interface Module. In addition, it shows how to use the Red and Blue LEDs.
*
* This op mode assumes that there is a Device Interface Module attached, named 'dim'.
* On this DIM there is a digital input named 'digin' and an output named 'digout'
*
* To fully exercise this sample, connect pin 3 of the digin connector to pin 3 of the digout.
* Note: Pin 1 is indicated by the black stripe, so pin 3 is at the opposite end.
*
* The X button on the gamepad will be used to activate the digital output pin.
* The Red/Blue LED will be used to indicate the state of the digital input pin.
* Blue = false (0V), Red = true (5V)
* If the two pins are linked, the gamepad will change the LED color.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: DIM DIO", group = "Sensor")
@Disabled
public class SensorDIO extends LinearOpMode {
final int BLUE_LED_CHANNEL = 0;
final int RED_LED_CHANNEL = 1;
@Override
public void runOpMode() {
boolean inputPin; // Input State
boolean outputPin; // Output State
DeviceInterfaceModule dim; // Device Object
DigitalChannel digIn; // Device Object
DigitalChannel digOut; // Device Object
// get a reference to a Modern Robotics DIM, and IO channels.
dim = hardwareMap.get(DeviceInterfaceModule.class, "dim"); // Use generic form of device mapping
digIn = hardwareMap.get(DigitalChannel.class, "digin"); // Use generic form of device mapping
digOut = hardwareMap.get(DigitalChannel.class, "digout"); // Use generic form of device mapping
digIn.setMode(DigitalChannel.Mode.INPUT); // Set the direction of each channel
digOut.setMode(DigitalChannel.Mode.OUTPUT);
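// (Illustrative, assuming pin 3 of 'digout' is jumpered to pin 3 of 'digin' as described above)
// pressing X drives the output high, the input then reads true, and the DIM LED shows red.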
// wait for the start button to be pressed.
telemetry.addData(">", "Press play, and then user X button to set DigOut");
telemetry.update();
waitForStart();
while (opModeIsActive()) {
outputPin = gamepad1.x ; // Set the output pin based on x button
digOut.setState(outputPin);
inputPin = digIn.getState(); // Read the input pin
// Display input pin state on LEDs
if (inputPin) {
dim.setLED(RED_LED_CHANNEL, true);
dim.setLED(BLUE_LED_CHANNEL, false);
}
else {
dim.setLED(RED_LED_CHANNEL, false);
dim.setLED(BLUE_LED_CHANNEL, true);
}
telemetry.addData("Output", outputPin );
telemetry.addData("Input", inputPin );
telemetry.addData("LED", inputPin ? "Red" : "Blue" );
telemetry.update();
}
}
}

View File

@ -0,0 +1,88 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DigitalChannel;
/*
* This is an example LinearOpMode that shows how to use
* a REV Robotics Touch Sensor.
*
* It assumes that the touch sensor is configured with a name of "sensor_digital".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list.
*/
@TeleOp(name = "Sensor: Digital touch", group = "Sensor")
@Disabled
public class SensorDigitalTouch extends LinearOpMode {
/**
* The REV Robotics Touch Sensor
* is treated as a digital channel. It is HIGH if the button is unpressed.
* It pulls LOW if the button is pressed.
*
* Also, when you connect a REV Robotics Touch Sensor to the digital I/O port on the
* Expansion Hub using a 4-wire JST cable, the second pin gets connected to the Touch Sensor.
 * The lower (first) pin stays unconnected.
*/
DigitalChannel digitalTouch; // Hardware Device Object
@Override
public void runOpMode() {
// get a reference to our digitalTouch object.
digitalTouch = hardwareMap.get(DigitalChannel.class, "sensor_digital");
// set the digital channel to input.
digitalTouch.setMode(DigitalChannel.Mode.INPUT);
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the light levels.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
// if the digital channel returns true it's HIGH and the button is unpressed.
if (digitalTouch.getState() == true) {
telemetry.addData("Digital Touch", "Is Not Pressed");
} else {
telemetry.addData("Digital Touch", "Is Pressed");
}
telemetry.update();
}
}
}

View File

@ -0,0 +1,129 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.kauailabs.NavxMicroNavigationSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
import org.firstinspires.ftc.robotcore.external.navigation.Orientation;
/*
* This is an example LinearOpMode that shows how to use Kauai Labs navX Micro Robotics Navigation
* Sensor. It assumes that the sensor is configured with a name of "navx".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: KL navX Micro", group = "Sensor")
@Disabled
public class SensorKLNavxMicro extends LinearOpMode {
/** In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
 * depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link NavxMicroNavigationSensor}, by contrast, provides functionality that
* is unique to the navX Micro sensor.
*/
IntegratingGyroscope gyro;
NavxMicroNavigationSensor navxMicro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override public void runOpMode() throws InterruptedException {
// Get a reference to a Modern Robotics GyroSensor object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
navxMicro = hardwareMap.get(NavxMicroNavigationSensor.class, "navx");
gyro = (IntegratingGyroscope)navxMicro;
// If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "navx");
// The gyro automatically starts calibrating. This takes a few seconds.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
// Wait until the gyro calibration is complete
timer.reset();
while (navxMicro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
Thread.sleep(50);
}
telemetry.log().clear(); telemetry.log().add("Gyro Calibrated. Press Start.");
telemetry.clear(); telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
while (opModeIsActive()) {
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
Orientation angles = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES);
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addLine()
.addData("heading", formatAngle(angles.angleUnit, angles.firstAngle))
.addData("roll", formatAngle(angles.angleUnit, angles.secondAngle))
.addData("pitch", "%s deg", formatAngle(angles.angleUnit, angles.thirdAngle));
telemetry.update();
idle(); // Always call idle() at the bottom of your while(opModeIsActive()) loop
}
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatAngle(AngleUnit angleUnit, double angle) {
return formatDegrees(AngleUnit.DEGREES.fromUnit(angleUnit, angle));
}
String formatDegrees(double degrees){
return String.format("%.1f", AngleUnit.DEGREES.normalize(degrees));
}
}

View File

@ -0,0 +1,139 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import android.app.Activity;
import android.graphics.Color;
import android.view.View;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.ColorSensor;
/*
*
* This is an example LinearOpMode that shows how to use
* a Modern Robotics Color Sensor.
*
* The op mode assumes that the color sensor
* is configured with a name of "sensor_color".
*
* You can use the X button on gamepad1 to toggle the LED on and off.
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Color", group = "Sensor")
@Disabled
public class SensorMRColor extends LinearOpMode {
ColorSensor colorSensor; // Hardware Device Object
@Override
public void runOpMode() {
// hsvValues is an array that will hold the hue, saturation, and value information.
float hsvValues[] = {0F,0F,0F};
// values is a reference to the hsvValues array.
final float values[] = hsvValues;
// get a reference to the RelativeLayout so we can change the background
// color of the Robot Controller app to match the hue detected by the RGB sensor.
int relativeLayoutId = hardwareMap.appContext.getResources().getIdentifier("RelativeLayout", "id", hardwareMap.appContext.getPackageName());
final View relativeLayout = ((Activity) hardwareMap.appContext).findViewById(relativeLayoutId);
// bPrevState and bCurrState represent the previous and current state of the button.
boolean bPrevState = false;
boolean bCurrState = false;
// bLedOn represents the state of the LED.
boolean bLedOn = true;
// get a reference to our ColorSensor object.
colorSensor = hardwareMap.get(ColorSensor.class, "sensor_color");
// Set the LED in the beginning
colorSensor.enableLed(bLedOn);
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the RGB data.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// check the status of the x button on either gamepad.
bCurrState = gamepad1.x;
// check for button state transitions.
if (bCurrState && (bCurrState != bPrevState)) {
// button is transitioning to a pressed state. So Toggle LED
bLedOn = !bLedOn;
colorSensor.enableLed(bLedOn);
}
// update previous state variable.
bPrevState = bCurrState;
// convert the RGB values to HSV values.
Color.RGBToHSV(colorSensor.red() * 8, colorSensor.green() * 8, colorSensor.blue() * 8, hsvValues);
// send the info back to driver station using telemetry function.
telemetry.addData("LED", bLedOn ? "On" : "Off");
telemetry.addData("Clear", colorSensor.alpha());
telemetry.addData("Red ", colorSensor.red());
telemetry.addData("Green", colorSensor.green());
telemetry.addData("Blue ", colorSensor.blue());
telemetry.addData("Hue", hsvValues[0]);
// change the background color to match the color detected by the RGB sensor.
// pass a reference to the hue, saturation, and value array as an argument
// to the HSVToColor method.
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.HSVToColor(0xff, values));
}
});
telemetry.update();
}
// Set the panel back to the default color
relativeLayout.post(new Runnable() {
public void run() {
relativeLayout.setBackgroundColor(Color.WHITE);
}
});
}
}

View File

@ -0,0 +1,148 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cCompassSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.CompassSensor;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.Acceleration;
/**
* The {@link SensorMRCompass} op mode provides a demonstration of the
* functionality provided by the Modern Robotics compass sensor.
*
* The op mode assumes that the MR compass is configured with a name of "compass".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* @see <a href="http://www.modernroboticsinc.com/compass">MR Compass Sensor</a>
*/
@TeleOp(name = "Sensor: MR compass", group = "Sensor")
@Disabled // comment out or remove this line to enable this opmode
public class SensorMRCompass extends LinearOpMode {
ModernRoboticsI2cCompassSensor compass;
ElapsedTime timer = new ElapsedTime();
@Override public void runOpMode() {
// get a reference to our compass
compass = hardwareMap.get(ModernRoboticsI2cCompassSensor.class, "compass");
telemetry.log().setCapacity(20);
telemetry.log().add("The compass sensor operates quite well out-of-the");
telemetry.log().add("box, as shipped by the manufacturer. Precision can");
telemetry.log().add("however be somewhat improved with calibration.");
telemetry.log().add("");
telemetry.log().add("To calibrate the compass once the opmode is");
telemetry.log().add("started, make sure the compass is level, then");
telemetry.log().add("press 'A' on the gamepad. Next, slowly rotate the ");
telemetry.log().add("compass in a full 360 degree circle while keeping");
telemetry.log().add("it level. When complete, press 'B'.");
// wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
while (opModeIsActive()) {
// If the A button is pressed, start calibration and wait for the A button to rise
if (gamepad1.a && !compass.isCalibrating()) {
telemetry.log().clear();
telemetry.log().add("Calibration started");
telemetry.log().add("Slowly rotate compass 360deg");
telemetry.log().add("Press 'B' when complete");
compass.setMode(CompassSensor.CompassMode.CALIBRATION_MODE);
timer.reset();
while (gamepad1.a && opModeIsActive()) {
doTelemetry();
idle();
}
}
// If the B button is pressed, stop calibration and wait for the B button to rise
if (gamepad1.b && compass.isCalibrating()) {
telemetry.log().clear();
telemetry.log().add("Calibration complete");
compass.setMode(CompassSensor.CompassMode.MEASUREMENT_MODE);
if (compass.calibrationFailed()) {
telemetry.log().add("Calibration failed");
compass.writeCommand(ModernRoboticsI2cCompassSensor.Command.NORMAL);
}
while (gamepad1.b && opModeIsActive()) {
doTelemetry();
idle();
}
}
doTelemetry();
}
}
protected void doTelemetry() {
if (compass.isCalibrating()) {
telemetry.addData("compass", "calibrating %s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
} else {
// getDirection() returns a traditional compass heading in the range [0,360),
// with values increasing in a CW direction
telemetry.addData("heading", "%.1f", compass.getDirection());
// getAcceleration() returns the current 3D acceleration experienced by
// the sensor. This is used internally to the sensor to compute its tilt and thence
// to correct the magnetometer reading to produce tilt-corrected values in getDirection()
Acceleration accel = compass.getAcceleration();
double accelMagnitude = Math.sqrt(accel.xAccel*accel.xAccel + accel.yAccel*accel.yAccel + accel.zAccel*accel.zAccel);
telemetry.addData("accel", accel);
telemetry.addData("accel magnitude", "%.3f", accelMagnitude);
// getMagneticFlux returns the 3D magnetic field flux experienced by the sensor
telemetry.addData("mag flux", compass.getMagneticFlux());
}
// the command register provides status data
telemetry.addData("command", "%s", compass.readCommand());
telemetry.update();
}
}

View File

@ -0,0 +1,162 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cGyro;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.IntegratingGyroscope;
import com.qualcomm.robotcore.util.ElapsedTime;
import org.firstinspires.ftc.robotcore.external.navigation.AngleUnit;
import org.firstinspires.ftc.robotcore.external.navigation.AngularVelocity;
import org.firstinspires.ftc.robotcore.external.navigation.AxesOrder;
import org.firstinspires.ftc.robotcore.external.navigation.AxesReference;
/*
* This is an example LinearOpMode that shows how to use the Modern Robotics Gyro.
*
* The op mode assumes that the gyro sensor is attached to a Device Interface Module
* I2C channel and is configured with a name of "gyro".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR Gyro", group = "Sensor")
@Disabled
public class SensorMRGyro extends LinearOpMode {
/** In this sample, for illustration purposes we use two interfaces on the one gyro object.
* That's likely atypical: you'll probably use one or the other in any given situation,
 * depending on what you're trying to do. {@link IntegratingGyroscope} (and its base interface,
* {@link Gyroscope}) are common interfaces supported by possibly several different gyro
* implementations. {@link ModernRoboticsI2cGyro}, by contrast, provides functionality that
* is unique to the Modern Robotics gyro sensor.
*/
IntegratingGyroscope gyro;
ModernRoboticsI2cGyro modernRoboticsI2cGyro;
// A timer helps provide feedback while calibration is taking place
ElapsedTime timer = new ElapsedTime();
@Override
public void runOpMode() {
boolean lastResetState = false;
boolean curResetState = false;
// Get a reference to a Modern Robotics gyro object. We use several interfaces
// on this object to illustrate which interfaces support which functionality.
modernRoboticsI2cGyro = hardwareMap.get(ModernRoboticsI2cGyro.class, "gyro");
gyro = (IntegratingGyroscope)modernRoboticsI2cGyro;
// If you're only interested in the IntegratingGyroscope interface, the following will suffice.
// gyro = hardwareMap.get(IntegratingGyroscope.class, "gyro");
// A similar approach will work for the Gyroscope interface, if that's all you need.
// Start calibrating the gyro. This takes a few seconds and is worth performing
// during the initialization phase at the start of each opMode.
telemetry.log().add("Gyro Calibrating. Do Not Move!");
modernRoboticsI2cGyro.calibrate();
// Wait until the gyro calibration is complete
timer.reset();
while (!isStopRequested() && modernRoboticsI2cGyro.isCalibrating()) {
telemetry.addData("calibrating", "%s", Math.round(timer.seconds())%2==0 ? "|.." : "..|");
telemetry.update();
sleep(50);
}
telemetry.log().clear(); telemetry.log().add("Gyro Calibrated. Press Start.");
telemetry.clear(); telemetry.update();
// Wait for the start button to be pressed
waitForStart();
telemetry.log().clear();
telemetry.log().add("Press A & B to reset heading");
// Loop until we're asked to stop
while (opModeIsActive()) {
// If the A and B buttons are pressed just now, reset Z heading.
curResetState = (gamepad1.a && gamepad1.b);
if (curResetState && !lastResetState) {
modernRoboticsI2cGyro.resetZAxisIntegrator();
}
lastResetState = curResetState;
// The raw() methods report the angular rate of change about each of the
// three axes directly as reported by the underlying sensor IC.
int rawX = modernRoboticsI2cGyro.rawX();
int rawY = modernRoboticsI2cGyro.rawY();
int rawZ = modernRoboticsI2cGyro.rawZ();
int heading = modernRoboticsI2cGyro.getHeading();
int integratedZ = modernRoboticsI2cGyro.getIntegratedZValue();
// Read dimensionalized data from the gyro. This gyro can report angular velocities
// about all three axes. Additionally, it internally integrates the Z axis to
// be able to report an absolute angular Z orientation.
AngularVelocity rates = gyro.getAngularVelocity(AngleUnit.DEGREES);
float zAngle = gyro.getAngularOrientation(AxesReference.INTRINSIC, AxesOrder.ZYX, AngleUnit.DEGREES).firstAngle;
// Read administrative information from the gyro
int zAxisOffset = modernRoboticsI2cGyro.getZAxisOffset();
int zAxisScalingCoefficient = modernRoboticsI2cGyro.getZAxisScalingCoefficient();
telemetry.addLine()
.addData("dx", formatRate(rates.xRotationRate))
.addData("dy", formatRate(rates.yRotationRate))
.addData("dz", "%s deg/s", formatRate(rates.zRotationRate));
telemetry.addData("angle", "%s deg", formatFloat(zAngle));
telemetry.addData("heading", "%3d deg", heading);
telemetry.addData("integrated Z", "%3d", integratedZ);
telemetry.addLine()
.addData("rawX", formatRaw(rawX))
.addData("rawY", formatRaw(rawY))
.addData("rawZ", formatRaw(rawZ));
telemetry.addLine().addData("z offset", zAxisOffset).addData("z coeff", zAxisScalingCoefficient);
telemetry.update();
}
}
String formatRaw(int rawValue) {
return String.format("%d", rawValue);
}
String formatRate(float rate) {
return String.format("%.3f", rate);
}
String formatFloat(float rate) {
return String.format("%.3f", rate);
}
}

View File

@ -0,0 +1,84 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.IrSeekerSensor;
/*
* This is an example LinearOpMode that shows how to use
 * the Modern Robotics IR Seeker.
*
* The op mode assumes that the IR Seeker
* is configured with a name of "sensor_ir".
*
* Set the switch on the Modern Robotics IR beacon to 1200 at 180. <br>
* Turn on the IR beacon.
* Make sure the side of the beacon with the LED on is facing the robot. <br>
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR IR Seeker", group = "Sensor")
@Disabled
public class SensorMRIrSeeker extends LinearOpMode {
@Override
public void runOpMode() {
IrSeekerSensor irSeeker; // Hardware Device Object
// get a reference to our GyroSensor object.
irSeeker = hardwareMap.get(IrSeekerSensor.class, "sensor_ir");
// wait for the start button to be pressed.
waitForStart();
while (opModeIsActive()) {
// Ensure we have an IR signal
if (irSeeker.signalDetected())
{
// Display angle and strength
telemetry.addData("Angle", irSeeker.getAngle());
telemetry.addData("Strength", irSeeker.getStrength());
}
else
{
// Display loss of signal
telemetry.addData("Seeker", "Signal Lost");
}
telemetry.update();
}
}
}

View File

@ -0,0 +1,71 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.OpticalDistanceSensor;
/*
* This is an example LinearOpMode that shows how to use
 * a Modern Robotics Optical Distance Sensor.
* It assumes that the ODS sensor is configured with a name of "sensor_ods".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*/
@TeleOp(name = "Sensor: MR ODS", group = "Sensor")
@Disabled
public class SensorMROpticalDistance extends LinearOpMode {
OpticalDistanceSensor odsSensor; // Hardware Device Object
@Override
public void runOpMode() {
// get a reference to our Light Sensor object.
odsSensor = hardwareMap.get(OpticalDistanceSensor.class, "sensor_ods");
// wait for the start button to be pressed.
waitForStart();
// while the op mode is active, loop and read the light levels.
// Note we use opModeIsActive() as our loop condition because it is an interruptible method.
while (opModeIsActive()) {
// send the info back to driver station using telemetry function.
telemetry.addData("Raw", odsSensor.getRawLightDetected());
telemetry.addData("Normal", odsSensor.getLightDetected());
telemetry.update();
}
}
}

View File

@ -0,0 +1,72 @@
/* Copyright (c) 2017 FIRST. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted (subject to the limitations in the disclaimer below) provided that
* the following conditions are met:
*
* Redistributions of source code must retain the above copyright notice, this list
* of conditions and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or
* other materials provided with the distribution.
*
* Neither the name of FIRST nor the names of its contributors may be used to endorse or
* promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
* LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
* "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
* ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.modernrobotics.ModernRoboticsI2cRangeSensor;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* {@link SensorMRRangeSensor} illustrates how to use the Modern Robotics
* Range Sensor.
*
* The op mode assumes that the range sensor is configured with a name of "sensor_range".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* @see <a href="http://modernroboticsinc.com/range-sensor">MR Range Sensor</a>
*/
@TeleOp(name = "Sensor: MR range sensor", group = "Sensor")
@Disabled // comment out or remove this line to enable this opmode
public class SensorMRRangeSensor extends LinearOpMode {
ModernRoboticsI2cRangeSensor rangeSensor;
@Override public void runOpMode() {
// get a reference to our compass
rangeSensor = hardwareMap.get(ModernRoboticsI2cRangeSensor.class, "sensor_range");
// wait for the start button to be pressed
waitForStart();
while (opModeIsActive()) {
telemetry.addData("raw ultrasonic", rangeSensor.rawUltrasonic());
telemetry.addData("raw optical", rangeSensor.rawOptical());
telemetry.addData("cm optical", "%.2f cm", rangeSensor.cmOptical());
telemetry.addData("cm", "%.2f cm", rangeSensor.getDistance(DistanceUnit.CM));
telemetry.update();
}
}
}

View File

@ -0,0 +1,89 @@
/*
Copyright (c) 2018 FIRST
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of FIRST nor the names of its contributors may be used to
endorse or promote products derived from this software without specific prior
written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR
TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.external.samples;
import com.qualcomm.hardware.rev.Rev2mDistanceSensor;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import org.firstinspires.ftc.robotcore.external.navigation.DistanceUnit;
/**
* {@link SensorREV2mDistance} illustrates how to use the REV Robotics
* Time-of-Flight Range Sensor.
*
* The op mode assumes that the range sensor is configured with a name of "sensor_range".
*
* Use Android Studio to Copy this Class, and Paste it into your team's code folder with a new name.
* Remove or comment out the @Disabled line to add this opmode to the Driver Station OpMode list
*
* @see <a href="http://revrobotics.com">REV Robotics Web Page</a>
*/
@TeleOp(name = "Sensor: REV2mDistance", group = "Sensor")
@Disabled
public class SensorREV2mDistance extends LinearOpMode {
private DistanceSensor sensorRange;
@Override
public void runOpMode() {
// you can use this as a regular DistanceSensor.
sensorRange = hardwareMap.get(DistanceSensor.class, "sensor_range");
// you can also cast this to a Rev2mDistanceSensor if you want to use added
// methods associated with the Rev2mDistanceSensor class.
Rev2mDistanceSensor sensorTimeOfFlight = (Rev2mDistanceSensor)sensorRange;
telemetry.addData(">>", "Press start to continue");
telemetry.update();
waitForStart();
while(opModeIsActive()) {
// generic DistanceSensor methods.
telemetry.addData("deviceName",sensorRange.getDeviceName() );
telemetry.addData("range", String.format("%.01f mm", sensorRange.getDistance(DistanceUnit.MM)));
telemetry.addData("range", String.format("%.01f cm", sensorRange.getDistance(DistanceUnit.CM)));
telemetry.addData("range", String.format("%.01f m", sensorRange.getDistance(DistanceUnit.METER)));
telemetry.addData("range", String.format("%.01f in", sensorRange.getDistance(DistanceUnit.INCH)));
// Rev2mDistanceSensor specific methods.
telemetry.addData("ID", String.format("%x", sensorTimeOfFlight.getModelID()));
telemetry.addData("did time out", Boolean.toString(sensorTimeOfFlight.didTimeoutOccur()));
telemetry.update();
}
}
}

View File

@ -0,0 +1,57 @@
## Caution
No Team-specific code should be placed or modified in this ``.../samples`` folder.
Samples should be Copied from here, and then Pasted into the team's
[/TeamCode/src/main/java/org/firstinspires/ftc/teamcode](../../../../../../../../../../TeamCode/src/main/java/org/firstinspires/ftc/teamcode)
folder, using the Android Studio copy and paste commands. This automatically changes all file and
class names to be consistent. From there, the sample can be modified to suit the team's needs.
For more detailed instructions see the /teamcode readme.
### Naming of Samples
To gain a better understanding of how the samples are organized, and how to interpret the
naming system, it will help to understand the conventions that were used during their creation.
These conventions are described (in detail) in the sample_conventions.md file in this folder.
To summarize: A range of different sample classes will reside in the java/external/samples folder.
The class names will follow a naming convention which indicates the purpose of each class.
The prefix of the name will be one of the following:
Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones examples.
Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot, it is simply showing the minimal code
required to read and display the sensor values.
Hardware: This is NOT an OpMode, but a helper class that is used to describe
one particular robot's hardware configuration: eg: For the K9 or Pushbot.
Look at any Pushbot sample to see how this can be used in an OpMode.
Teams can copy one of these to their team folder to create their own robot definition.
Pushbot: This is a Sample OpMode that uses the Pushbot robot hardware as a base.
It may be used to provide some standard baseline Pushbot OpModes, or
to demonstrate how a particular sensor or concept can be used directly on the
Pushbot chassis.
Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the comments should reference an external doc, guide or tutorial.
Each OpMode should try to only demonstrate a single concept so they are easy to
locate based on their name. These OpModes may not produce a drivable robot.
Library: This is a class, or set of classes used to implement some strategy.
These will typically NOT implement a full OpMode. Instead they will be included
by an OpMode to provide some stand-alone capability.
After the prefix, other conventions will apply:
* Sensor class names are constructed as: Sensor - Company - Type
* Hardware class names are constructed as: Hardware - Robot type
* Pushbot class names are constructed as: Pushbot - Mode - Action - OpModetype
* Concept class names are constructed as: Concept - Topic - OpModetype
* Library class names are constructed as: Library - Topic - OpModetype

View File

@ -0,0 +1,118 @@
## Sample Class/Opmode conventions
#### V 1.1.0 8/9/2017
This document defines the FTC Sample OpMode and Class conventions.
### OpMode Name
A range of different sample classes will reside in the java/external/samples folder.
For ease of understanding, the class names will follow a naming convention which indicates
the purpose of each class. The prefix of the name will be one of the following:
Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones Tank Drive examples.
Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot, it is simply showing the minimal code
required to read and display the sensor values.
Hardware: This is not an actual OpMode, but a helper class that is used to describe
one particular robot's hardware configuration: eg: For the K9 or Pushbot.
Look at any Pushbot sample to see how this can be used in an OpMode.
Teams can copy one of these to create their own robot definition.
Pushbot: This is a Sample OpMode that uses the Pushbot robot hardware as a base.
It may be used to provide some standard baseline Pushbot opmodes, or
to demonstrate how a particular sensor or concept can be used directly on the
Pushbot chassis.
Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the comments should reference an external doc, guide or tutorial.
Each OpMode should try to only demonstrate a single concept so they are easy to
locate based on their name.
Library: This is a class, or set of classes used to implement some strategy.
These will typically NOT implement a full opmode. Instead they will be included
by an OpMode to provide some stand-alone capability.
After the prefix, other conventions will apply:
* Sensor class names should be constructed as: Sensor - Company - Type
* Hardware class names should be constructed as: Hardware - Robot type
* Pushbot class names should be constructed as: Pushbot - Mode - Action - OpModetype
* Concept class names should be constructed as: Concept - Topic - OpModetype
* Library class names should be constructed as: Library - Topic - OpModetype
### Sample OpMode Content/Style
Code is formatted as per the Google Style Guide:
https://google.github.io/styleguide/javaguide.html
With “Sensor” and “Hardware” samples, the code should demonstrate the essential function,
and not be embellished with too much additional “clever” code. If a sensor has special
addressing needs, or has a variety of modes or outputs, these should be demonstrated as
simply as possible.
Special programming methods, or robot control techniques should be reserved for “Concept” Samples,
and where possible, Samples should strive to only demonstrate a single concept,
eg: State machine coding, or a User Menu system, and not combine them into a single “all inclusive”
sample. This will prevent an “all inclusive” Sample being deleted just because one part of it
becomes obsolete.
### Device Configuration Names
The following device names are used in the external samples
** Motors:
left_drive
right_drive
left_arm
** Servos:
left_hand
right_hand
arm
claw
** Sensors:
sensor_color
sensor_ir
sensor_light
sensor_ods
sensor_range
sensor_touch
sensor_color_distance
sensor_digital
digin
digout
** Localization:
compass
gyro
imu
navx
### Device Object Names
Device Object names should use the same words as the device's configuration name, but they
should be re-structured to be a suitable Java variable name. This should keep the same word order,
but adopt the style of beginning with a lower case letter, and then each subsequent word
starting with an upper case letter.
Eg: from the examples above: leftDrive, rightHand, sensorColor, sensorRange.
Note: Sometimes it's helpful to put the device type first, followed by the variant,
eg: motorLeft and motorRight, but this should only be done if the same word order
is used in the device configuration name.
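As an illustration of this mapping, here is a minimal sketch. The class name `ConceptNamingSketch` and the choice of a motor plus a color sensor are hypothetical; the configuration names ("left_drive", "sensor_color") come from the list above.

```java
package org.firstinspires.ftc.teamcode;

import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.ColorSensor;
import com.qualcomm.robotcore.hardware.DcMotor;

@TeleOp(name = "Concept: Naming Sketch", group = "Concept")
public class ConceptNamingSketch extends LinearOpMode {
    @Override
    public void runOpMode() {
        // Configuration name "left_drive" becomes the Java variable leftDrive.
        DcMotor leftDrive = hardwareMap.get(DcMotor.class, "left_drive");
        // Configuration name "sensor_color" becomes the Java variable sensorColor.
        ColorSensor sensorColor = hardwareMap.get(ColorSensor.class, "sensor_color");

        waitForStart();
        while (opModeIsActive()) {
            telemetry.addData("left power", leftDrive.getPower());
            telemetry.addData("alpha", sensorColor.alpha());
            telemetry.update();
        }
    }
}
```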
### OpMode code Comments
Sample comments should read like normal code comments, that is, as an explanation of what the
sample code is doing. They should NOT be directives to the user,
like: “insert your joystick code here”, as these comments typically aren't
detailed enough to be useful. They also often get left in the code and become garbage.
Instead, an example of the joystick code should be shown with a comment describing what it is doing.
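For example, a hedged sketch of this style (assuming a LinearOpMode whose hardware mapping has already produced `leftDrive` and `rightDrive` motor objects; those names are illustrative):

```java
// Tank drive: each stick's Y axis drives one side of the robot.
// The sticks report -1.0 when pushed forward, so negate the values
// so that pushing forward produces positive motor power.
double leftPower  = -gamepad1.left_stick_y;
double rightPower = -gamepad1.right_stick_y;
leftDrive.setPower(leftPower);
rightDrive.setPower(rightPower);
```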

View File

@ -0,0 +1,70 @@
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import com.qualcomm.robotcore.eventloop.opmode.OpModeManager;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import org.firstinspires.ftc.robotcontroller.external.samples.ConceptNullOp;
/**
* {@link FtcOpModeRegister} is responsible for registering opmodes for use in an FTC game.
* @see #register(OpModeManager)
*/
public class FtcOpModeRegister implements OpModeRegister {
/**
* {@link #register(OpModeManager)} is called by the SDK game in order to register
* OpMode classes or instances that will participate in an FTC game.
*
* There are two mechanisms by which an OpMode may be registered.
*
* 1) The preferred method is by means of class annotations in the OpMode itself.
* See, for example the class annotations in {@link ConceptNullOp}.
*
* 2) The other, retired, method is to modify this {@link #register(OpModeManager)}
* method to include explicit calls to OpModeManager.register().
* This method of modifying this file directly is discouraged, as it
* makes updates to the SDK harder to integrate into your code.
*
* @param manager the object which contains methods for carrying out OpMode registrations
*
* @see com.qualcomm.robotcore.eventloop.opmode.TeleOp
* @see com.qualcomm.robotcore.eventloop.opmode.Autonomous
*/
public void register(OpModeManager manager) {
/**
* Any manual OpMode class registrations should go here.
*/
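// For example (hypothetical team class; annotation-based registration is preferred):
//   manager.register("My TeleOp", org.firstinspires.ftc.teamcode.MyTeleOp.class);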
}
}

View File

@ -0,0 +1,859 @@
/* Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */
package org.firstinspires.ftc.robotcontroller.internal;
import android.app.ActionBar;
import android.app.Activity;
import android.app.ActivityManager;
import android.content.ComponentName;
import android.content.Context;
import android.content.Intent;
import android.content.ServiceConnection;
import android.content.SharedPreferences;
import android.content.res.Configuration;
import android.hardware.usb.UsbDevice;
import android.hardware.usb.UsbManager;
import android.net.wifi.WifiManager;
import android.os.Bundle;
import android.os.IBinder;
import android.preference.PreferenceManager;
import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.annotation.StringRes;
import android.view.Menu;
import android.view.MenuItem;
import android.view.MotionEvent;
import android.view.View;
import android.view.WindowManager;
import android.webkit.WebView;
import android.widget.ImageButton;
import android.widget.LinearLayout;
import android.widget.LinearLayout.LayoutParams;
import android.widget.PopupMenu;
import android.widget.TextView;
import com.google.blocks.ftcrobotcontroller.ProgrammingWebHandlers;
import com.google.blocks.ftcrobotcontroller.runtime.BlocksOpMode;
import com.qualcomm.ftccommon.ClassManagerFactory;
import com.qualcomm.ftccommon.FtcAboutActivity;
import com.qualcomm.ftccommon.FtcEventLoop;
import com.qualcomm.ftccommon.FtcEventLoopIdle;
import com.qualcomm.ftccommon.FtcRobotControllerService;
import com.qualcomm.ftccommon.FtcRobotControllerService.FtcRobotControllerBinder;
import com.qualcomm.ftccommon.FtcRobotControllerSettingsActivity;
import com.qualcomm.ftccommon.LaunchActivityConstantsList;
import com.qualcomm.ftccommon.LaunchActivityConstantsList.RequestCode;
import com.qualcomm.ftccommon.Restarter;
import com.qualcomm.ftccommon.UpdateUI;
import com.qualcomm.ftccommon.configuration.EditParameters;
import com.qualcomm.ftccommon.configuration.FtcLoadFileActivity;
import com.qualcomm.ftccommon.configuration.RobotConfigFile;
import com.qualcomm.ftccommon.configuration.RobotConfigFileManager;
import com.qualcomm.ftcrobotcontroller.BuildConfig;
import com.qualcomm.ftcrobotcontroller.R;
import com.qualcomm.hardware.HardwareFactory;
import com.qualcomm.robotcore.eventloop.EventLoopManager;
import com.qualcomm.robotcore.eventloop.opmode.FtcRobotControllerServiceState;
import com.qualcomm.robotcore.eventloop.opmode.OpModeRegister;
import com.qualcomm.robotcore.hardware.configuration.LynxConstants;
import com.qualcomm.robotcore.hardware.configuration.Utility;
import com.qualcomm.robotcore.robot.Robot;
import com.qualcomm.robotcore.robot.RobotState;
import com.qualcomm.robotcore.util.ClockWarningSource;
import com.qualcomm.robotcore.util.Device;
import com.qualcomm.robotcore.util.Dimmer;
import com.qualcomm.robotcore.util.ImmersiveMode;
import com.qualcomm.robotcore.util.RobotLog;
import com.qualcomm.robotcore.util.WebServer;
import com.qualcomm.robotcore.wifi.NetworkConnection;
import com.qualcomm.robotcore.wifi.NetworkConnectionFactory;
import com.qualcomm.robotcore.wifi.NetworkType;
import org.firstinspires.ftc.ftccommon.external.SoundPlayingRobotMonitor;
import org.firstinspires.ftc.ftccommon.internal.AnnotatedHooksClassFilter;
import org.firstinspires.ftc.ftccommon.internal.FtcRobotControllerWatchdogService;
import org.firstinspires.ftc.ftccommon.internal.ProgramAndManageActivity;
import org.firstinspires.ftc.onbotjava.ExternalLibraries;
import org.firstinspires.ftc.onbotjava.OnBotJavaHelperImpl;
import org.firstinspires.ftc.onbotjava.OnBotJavaProgrammingMode;
import org.firstinspires.ftc.robotcore.external.navigation.MotionDetection;
import org.firstinspires.ftc.robotcore.internal.hardware.android.AndroidBoard;
import org.firstinspires.ftc.robotcore.internal.network.DeviceNameManagerFactory;
import org.firstinspires.ftc.robotcore.internal.network.PreferenceRemoterRC;
import org.firstinspires.ftc.robotcore.internal.network.StartResult;
import org.firstinspires.ftc.robotcore.internal.network.WifiDirectChannelChanger;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteEvent;
import org.firstinspires.ftc.robotcore.internal.network.WifiMuteStateMachine;
import org.firstinspires.ftc.robotcore.internal.opmode.ClassManager;
import org.firstinspires.ftc.robotcore.internal.opmode.OnBotJavaHelper;
import org.firstinspires.ftc.robotcore.internal.system.AppAliveNotifier;
import org.firstinspires.ftc.robotcore.internal.system.AppUtil;
import org.firstinspires.ftc.robotcore.internal.system.Assert;
import org.firstinspires.ftc.robotcore.internal.system.PreferencesHelper;
import org.firstinspires.ftc.robotcore.internal.system.ServiceController;
import org.firstinspires.ftc.robotcore.internal.ui.ThemedActivity;
import org.firstinspires.ftc.robotcore.internal.ui.UILocation;
import org.firstinspires.ftc.robotcore.internal.webserver.RobotControllerWebInfo;
import org.firstinspires.ftc.robotserver.internal.programmingmode.ProgrammingModeManager;
import org.firstinspires.inspection.RcInspectionActivity;
import org.threeten.bp.YearMonth;
import org.xmlpull.v1.XmlPullParserException;
import java.io.FileNotFoundException;
import java.util.List;
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
@SuppressWarnings("WeakerAccess")
public class FtcRobotControllerActivity extends Activity
{
public static final String TAG = "RCActivity";
public String getTag() { return TAG; }
private static final int REQUEST_CONFIG_WIFI_CHANNEL = 1;
private static final int NUM_GAMEPADS = 2;
protected WifiManager.WifiLock wifiLock;
protected RobotConfigFileManager cfgFileMgr;
private OnBotJavaHelper onBotJavaHelper;
protected ProgrammingModeManager programmingModeManager;
protected UpdateUI.Callback callback;
protected Context context;
protected Utility utility;
protected StartResult prefRemoterStartResult = new StartResult();
protected StartResult deviceNameStartResult = new StartResult();
protected PreferencesHelper preferencesHelper;
protected final SharedPreferencesListener sharedPreferencesListener = new SharedPreferencesListener();
protected ImageButton buttonMenu;
protected TextView textDeviceName;
protected TextView textNetworkConnectionStatus;
protected TextView textRobotStatus;
protected TextView[] textGamepad = new TextView[NUM_GAMEPADS];
protected TextView textOpMode;
protected TextView textErrorMessage;
protected ImmersiveMode immersion;
protected UpdateUI updateUI;
protected Dimmer dimmer;
protected LinearLayout entireScreenLayout;
protected FtcRobotControllerService controllerService;
protected NetworkType networkType;
protected FtcEventLoop eventLoop;
protected Queue<UsbDevice> receivedUsbAttachmentNotifications;
protected WifiMuteStateMachine wifiMuteStateMachine;
protected MotionDetection motionDetection;
private static boolean permissionsValidated = false;
private WifiDirectChannelChanger wifiDirectChannelChanger;
protected class RobotRestarter implements Restarter {
public void requestRestart() {
requestRobotRestart();
}
}
protected boolean serviceShouldUnbind = false;
protected ServiceConnection connection = new ServiceConnection() {
@Override
public void onServiceConnected(ComponentName name, IBinder service) {
FtcRobotControllerBinder binder = (FtcRobotControllerBinder) service;
onServiceBind(binder.getService());
}
@Override
public void onServiceDisconnected(ComponentName name) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=null", TAG);
controllerService = null;
}
};
@Override
protected void onNewIntent(Intent intent) {
super.onNewIntent(intent);
    if (UsbManager.ACTION_USB_DEVICE_ATTACHED.equals(intent.getAction())) {
      UsbDevice usbDevice = intent.getParcelableExtra(UsbManager.EXTRA_DEVICE);
      if (usbDevice != null) { // paranoia: the extra can, in principle, be null
        RobotLog.vv(TAG, "ACTION_USB_DEVICE_ATTACHED: %s", usbDevice.getDeviceName());
// We might get attachment notifications before the event loop is set up, so
// we hold on to them and pass them along only when we're good and ready.
if (receivedUsbAttachmentNotifications != null) { // *total* paranoia
receivedUsbAttachmentNotifications.add(usbDevice);
passReceivedUsbAttachmentsToEventLoop();
}
}
}
}
protected void passReceivedUsbAttachmentsToEventLoop() {
if (this.eventLoop != null) {
for (;;) {
UsbDevice usbDevice = receivedUsbAttachmentNotifications.poll();
if (usbDevice == null)
break;
this.eventLoop.onUsbDeviceAttached(usbDevice);
}
}
else {
// Paranoia: we don't want the pending list to grow without bound when we don't
// (yet) have an event loop
while (receivedUsbAttachmentNotifications.size() > 100) {
receivedUsbAttachmentNotifications.poll();
}
}
}
/**
 * There are cases where a permission may be revoked, and the subsequent restart will launch
 * FtcRobotControllerActivity directly instead of the launch activity. Detect when that happens,
 * and send the device back to the permission validator activity.
*/
protected boolean enforcePermissionValidator() {
if (!permissionsValidated) {
RobotLog.vv(TAG, "Redirecting to permission validator");
Intent permissionValidatorIntent = new Intent(AppUtil.getDefContext(), PermissionValidatorWrapper.class);
startActivity(permissionValidatorIntent);
finish();
return true;
} else {
RobotLog.vv(TAG, "Permissions validated already");
return false;
}
}
public static void setPermissionsValidated() {
permissionsValidated = true;
}
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
if (enforcePermissionValidator()) {
return;
}
RobotLog.onApplicationStart(); // robustify against onCreate() following onDestroy() but using the same app instance, which apparently does happen
RobotLog.vv(TAG, "onCreate()");
    ThemedActivity.appAppThemeToActivity(getTag(), this); // do it this way instead of inheriting, to help AppInventor
    // Oddly, sometimes after a crash & restart the root activity will be something unexpected, like one left over from before the crash. We don't yet understand why.
RobotLog.vv(TAG, "rootActivity is of class %s", AppUtil.getInstance().getRootActivity().getClass().getSimpleName());
RobotLog.vv(TAG, "launchActivity is of class %s", FtcRobotControllerWatchdogService.launchActivity());
Assert.assertTrue(FtcRobotControllerWatchdogService.isLaunchActivity(AppUtil.getInstance().getRootActivity()));
Assert.assertTrue(AppUtil.getInstance().isRobotController());
// Quick check: should we pretend we're not here, and so allow the Lynx to operate as
// a stand-alone USB-connected module?
if (LynxConstants.isRevControlHub()) {
// Double-sure check that we can talk to the DB over the serial TTY
AndroidBoard.getInstance().getAndroidBoardIsPresentPin().setState(true);
}
context = this;
utility = new Utility(this);
DeviceNameManagerFactory.getInstance().start(deviceNameStartResult);
PreferenceRemoterRC.getInstance().start(prefRemoterStartResult);
receivedUsbAttachmentNotifications = new ConcurrentLinkedQueue<UsbDevice>();
eventLoop = null;
setContentView(R.layout.activity_ftc_controller);
preferencesHelper = new PreferencesHelper(TAG, context);
preferencesHelper.writeBooleanPrefIfDifferent(context.getString(R.string.pref_rc_connected), true);
preferencesHelper.getSharedPreferences().registerOnSharedPreferenceChangeListener(sharedPreferencesListener);
    // Check if this RC app is from a later FTC season than what was installed previously
int ftcSeasonYearOfPreviouslyInstalledRc = preferencesHelper.readInt(getString(R.string.pref_ftc_season_year_of_current_rc), 0);
int ftcSeasonYearOfCurrentlyInstalledRc = AppUtil.getInstance().getFtcSeasonYear(YearMonth.now()).getValue();
if (ftcSeasonYearOfCurrentlyInstalledRc > ftcSeasonYearOfPreviouslyInstalledRc) {
preferencesHelper.writeIntPrefIfDifferent(getString(R.string.pref_ftc_season_year_of_current_rc), ftcSeasonYearOfCurrentlyInstalledRc);
// Since it's a new FTC season, we should reset certain settings back to their default values.
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_2_4_ghz_band), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_obsolete_software), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_mismatched_app_versions), true);
preferencesHelper.writeBooleanPrefIfDifferent(getString(R.string.pref_warn_about_incorrect_clocks), true);
}
entireScreenLayout = (LinearLayout) findViewById(R.id.entire_screen);
buttonMenu = (ImageButton) findViewById(R.id.menu_buttons);
buttonMenu.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
PopupMenu popupMenu = new PopupMenu(FtcRobotControllerActivity.this, v);
popupMenu.setOnMenuItemClickListener(new PopupMenu.OnMenuItemClickListener() {
@Override
public boolean onMenuItemClick(MenuItem item) {
return onOptionsItemSelected(item); // Delegate to the handler for the hardware menu button
}
});
popupMenu.inflate(R.menu.ftc_robot_controller);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(
FtcRobotControllerActivity.this, popupMenu.getMenu());
popupMenu.show();
}
});
updateMonitorLayout(getResources().getConfiguration());
BlocksOpMode.setActivityAndWebView(this, (WebView) findViewById(R.id.webViewBlocksRuntime));
ExternalLibraries.getInstance().onCreate();
onBotJavaHelper = new OnBotJavaHelperImpl();
/*
     * Paranoia: the ClassManagerFactory requires EXTERNAL_STORAGE permissions, and we've seen
     * cases on the DS where the finish() call above does not short-circuit the rest of this
     * onCreate() call, which would then crash here for lack of permissions. So we check
     * explicitly before proceeding.
*/
if (permissionsValidated) {
ClassManager.getInstance().setOnBotJavaClassHelper(onBotJavaHelper);
ClassManagerFactory.registerFilters();
ClassManagerFactory.processAllClasses();
}
cfgFileMgr = new RobotConfigFileManager(this);
// Clean up 'dirty' status after a possible crash
RobotConfigFile configFile = cfgFileMgr.getActiveConfig();
if (configFile.isDirty()) {
configFile.markClean();
cfgFileMgr.setActiveConfig(false, configFile);
}
textDeviceName = (TextView) findViewById(R.id.textDeviceName);
textNetworkConnectionStatus = (TextView) findViewById(R.id.textNetworkConnectionStatus);
textRobotStatus = (TextView) findViewById(R.id.textRobotStatus);
textOpMode = (TextView) findViewById(R.id.textOpMode);
textErrorMessage = (TextView) findViewById(R.id.textErrorMessage);
textGamepad[0] = (TextView) findViewById(R.id.textGamepad1);
textGamepad[1] = (TextView) findViewById(R.id.textGamepad2);
immersion = new ImmersiveMode(getWindow().getDecorView());
dimmer = new Dimmer(this);
dimmer.longBright();
programmingModeManager = new ProgrammingModeManager();
programmingModeManager.register(new ProgrammingWebHandlers());
programmingModeManager.register(new OnBotJavaProgrammingMode());
updateUI = createUpdateUI();
callback = createUICallback(updateUI);
PreferenceManager.setDefaultValues(this, R.xml.app_settings, false);
WifiManager wifiManager = (WifiManager) getApplicationContext().getSystemService(Context.WIFI_SERVICE);
wifiLock = wifiManager.createWifiLock(WifiManager.WIFI_MODE_FULL_HIGH_PERF, "");
hittingMenuButtonBrightensScreen();
wifiLock.acquire();
callback.networkConnectionUpdate(NetworkConnection.NetworkEvent.DISCONNECTED);
readNetworkType();
ServiceController.startService(FtcRobotControllerWatchdogService.class);
bindToService();
logPackageVersions();
logDeviceSerialNumber();
AndroidBoard.getInstance().logAndroidBoardInfo();
RobotLog.logDeviceInfo();
if (preferencesHelper.readBoolean(getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
}
FtcAboutActivity.setBuildTimeFromBuildConfig(BuildConfig.APP_BUILD_TIME);
    // Check to see if there is a preferred Wi-Fi Direct channel to use.
checkPreferredChannel();
AnnotatedHooksClassFilter.getInstance().callOnCreateMethods(this);
}
protected UpdateUI createUpdateUI() {
Restarter restarter = new RobotRestarter();
UpdateUI result = new UpdateUI(this, dimmer);
result.setRestarter(restarter);
result.setTextViews(textNetworkConnectionStatus, textRobotStatus, textGamepad, textOpMode, textErrorMessage, textDeviceName);
return result;
}
protected UpdateUI.Callback createUICallback(UpdateUI updateUI) {
UpdateUI.Callback result = updateUI.new Callback();
result.setStateMonitor(new SoundPlayingRobotMonitor());
return result;
}
@Override
protected void onStart() {
super.onStart();
RobotLog.vv(TAG, "onStart()");
entireScreenLayout.setOnTouchListener(new View.OnTouchListener() {
@Override
public boolean onTouch(View v, MotionEvent event) {
dimmer.handleDimTimer();
return false;
}
});
}
@Override
protected void onResume() {
super.onResume();
RobotLog.vv(TAG, "onResume()");
// In case the user just got back from fixing their clock, refresh ClockWarningSource
ClockWarningSource.getInstance().onPossibleRcClockUpdate();
}
@Override
protected void onPause() {
super.onPause();
RobotLog.vv(TAG, "onPause()");
}
@Override
protected void onStop() {
// Note: this gets called even when the configuration editor is launched. That is, it gets
// called surprisingly often. So, we don't actually do much here.
super.onStop();
RobotLog.vv(TAG, "onStop()");
}
@Override
protected void onDestroy() {
super.onDestroy();
RobotLog.vv(TAG, "onDestroy()");
shutdownRobot(); // Ensure the robot is put away to bed
if (callback != null) callback.close();
PreferenceRemoterRC.getInstance().stop(prefRemoterStartResult);
DeviceNameManagerFactory.getInstance().stop(deviceNameStartResult);
unbindFromService();
    // If the app is stopped manually, we no longer need the watchdog's auto-restart behavior
ServiceController.stopService(FtcRobotControllerWatchdogService.class);
if (wifiLock != null) wifiLock.release();
if (preferencesHelper != null) preferencesHelper.getSharedPreferences().unregisterOnSharedPreferenceChangeListener(sharedPreferencesListener);
RobotLog.cancelWriteLogcatToDisk();
AnnotatedHooksClassFilter.getInstance().callOnDestroyMethods(this);
}
protected void bindToService() {
readNetworkType();
Intent intent = new Intent(this, FtcRobotControllerService.class);
intent.putExtra(NetworkConnectionFactory.NETWORK_CONNECTION_TYPE, networkType);
serviceShouldUnbind = bindService(intent, connection, Context.BIND_AUTO_CREATE);
}
protected void unbindFromService() {
if (serviceShouldUnbind) {
unbindService(connection);
serviceShouldUnbind = false;
}
}
protected void logPackageVersions() {
RobotLog.logBuildConfig(com.qualcomm.ftcrobotcontroller.BuildConfig.class);
RobotLog.logBuildConfig(com.qualcomm.robotcore.BuildConfig.class);
RobotLog.logBuildConfig(com.qualcomm.hardware.BuildConfig.class);
RobotLog.logBuildConfig(com.qualcomm.ftccommon.BuildConfig.class);
RobotLog.logBuildConfig(com.google.blocks.BuildConfig.class);
RobotLog.logBuildConfig(org.firstinspires.inspection.BuildConfig.class);
}
protected void logDeviceSerialNumber() {
RobotLog.ii(TAG, "Android device serial number: " + Device.getSerialNumberOrUnknown());
}
protected void readNetworkType() {
    // Control Hubs always run the access point model. Everything else, for the time
    // being, runs the Wi-Fi Direct model.
    if (Device.isRevControlHub()) {
networkType = NetworkType.RCWIRELESSAP;
} else {
networkType = NetworkType.fromString(preferencesHelper.readString(context.getString(R.string.pref_pairing_kind), NetworkType.globalDefaultAsString()));
}
// update the app_settings
preferencesHelper.writeStringPrefIfDifferent(context.getString(R.string.pref_pairing_kind), networkType.toString());
}
@Override
public void onWindowFocusChanged(boolean hasFocus) {
super.onWindowFocusChanged(hasFocus);
if (hasFocus) {
immersion.hideSystemUI();
getWindow().setFlags(WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION, WindowManager.LayoutParams.FLAG_TRANSLUCENT_NAVIGATION);
}
}
@Override
public boolean onCreateOptionsMenu(Menu menu) {
getMenuInflater().inflate(R.menu.ftc_robot_controller, menu);
AnnotatedHooksClassFilter.getInstance().callOnCreateMenuMethods(this, menu);
return true;
}
private boolean isRobotRunning() {
if (controllerService == null) {
return false;
}
Robot robot = controllerService.getRobot();
if ((robot == null) || (robot.eventLoopManager == null)) {
return false;
}
RobotState robotState = robot.eventLoopManager.state;
    return robotState == RobotState.RUNNING;
}
@Override
public boolean onOptionsItemSelected(MenuItem item) {
int id = item.getItemId();
if (id == R.id.action_program_and_manage) {
if (isRobotRunning()) {
Intent programmingModeIntent = new Intent(AppUtil.getDefContext(), ProgramAndManageActivity.class);
RobotControllerWebInfo webInfo = programmingModeManager.getWebServer().getConnectionInformation();
programmingModeIntent.putExtra(LaunchActivityConstantsList.RC_WEB_INFO, webInfo.toJson());
startActivity(programmingModeIntent);
} else {
AppUtil.getInstance().showToast(UILocation.ONLY_LOCAL, context.getString(R.string.toastWifiUpBeforeProgrammingMode));
}
} else if (id == R.id.action_inspection_mode) {
Intent inspectionModeIntent = new Intent(AppUtil.getDefContext(), RcInspectionActivity.class);
startActivity(inspectionModeIntent);
return true;
} else if (id == R.id.action_restart_robot) {
dimmer.handleDimTimer();
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastRestartingRobot));
requestRobotRestart();
return true;
}
else if (id == R.id.action_configure_robot) {
EditParameters parameters = new EditParameters();
Intent intentConfigure = new Intent(AppUtil.getDefContext(), FtcLoadFileActivity.class);
parameters.putIntent(intentConfigure);
startActivityForResult(intentConfigure, RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal());
}
else if (id == R.id.action_settings) {
// historical: this once erroneously used FTC_CONFIGURE_REQUEST_CODE_ROBOT_CONTROLLER
Intent settingsIntent = new Intent(AppUtil.getDefContext(), FtcRobotControllerSettingsActivity.class);
startActivityForResult(settingsIntent, RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal());
return true;
}
else if (id == R.id.action_about) {
Intent intent = new Intent(AppUtil.getDefContext(), FtcAboutActivity.class);
startActivity(intent);
return true;
}
else if (id == R.id.action_exit_app) {
//Clear backstack and everything to prevent edge case where VM might be
//restarted (after it was exited) if more than one activity was on the
//backstack for some reason.
finishAffinity();
//For lollipop and up, we can clear ourselves from the recents list too
if (android.os.Build.VERSION.SDK_INT >= android.os.Build.VERSION_CODES.LOLLIPOP) {
ActivityManager manager = (ActivityManager) getSystemService(ACTIVITY_SERVICE);
List<ActivityManager.AppTask> tasks = manager.getAppTasks();
for (ActivityManager.AppTask task : tasks) {
task.finishAndRemoveTask();
}
}
// Allow the user to use the Control Hub operating system's UI, instead of relaunching the app
AppAliveNotifier.getInstance().disableAppWatchdogUntilNextAppStart();
//Finally, nuke the VM from orbit
AppUtil.getInstance().exitApplication();
return true;
}
return super.onOptionsItemSelected(item);
}
@Override
public void onConfigurationChanged(Configuration newConfig) {
super.onConfigurationChanged(newConfig);
// don't destroy assets on screen rotation
updateMonitorLayout(newConfig);
}
/**
* Updates the orientation of monitorContainer (which contains cameraMonitorView and
* tfodMonitorView) based on the given configuration. Makes the children split the space.
*/
private void updateMonitorLayout(Configuration configuration) {
LinearLayout monitorContainer = (LinearLayout) findViewById(R.id.monitorContainer);
if (configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
// When the phone is landscape, lay out the monitor views horizontally.
monitorContainer.setOrientation(LinearLayout.HORIZONTAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(0, LayoutParams.MATCH_PARENT, 1 /* weight */));
}
} else {
// When the phone is portrait, lay out the monitor views vertically.
monitorContainer.setOrientation(LinearLayout.VERTICAL);
for (int i = 0; i < monitorContainer.getChildCount(); i++) {
View view = monitorContainer.getChildAt(i);
view.setLayoutParams(new LayoutParams(LayoutParams.MATCH_PARENT, 0, 1 /* weight */));
}
}
monitorContainer.requestLayout();
}
@Override
protected void onActivityResult(int request, int result, Intent intent) {
if (request == REQUEST_CONFIG_WIFI_CHANNEL) {
if (result == RESULT_OK) {
AppUtil.getInstance().showToast(UILocation.BOTH, context.getString(R.string.toastWifiConfigurationComplete));
}
}
    // There was some historical confusion about launch codes here, so we handle both to be safe
if (request == RequestCode.CONFIGURE_ROBOT_CONTROLLER.ordinal() || request == RequestCode.SETTINGS_ROBOT_CONTROLLER.ordinal()) {
// We always do a refresh, whether it was a cancel or an OK, for robustness
shutdownRobot();
cfgFileMgr.getActiveConfigAndUpdateUI();
updateUIAndRequestRobotSetup();
}
}
public void onServiceBind(final FtcRobotControllerService service) {
RobotLog.vv(FtcRobotControllerService.TAG, "%s.controllerService=bound", TAG);
controllerService = service;
updateUI.setControllerService(controllerService);
controllerService.setOnBotJavaHelper(onBotJavaHelper);
updateUIAndRequestRobotSetup();
programmingModeManager.setState(new FtcRobotControllerServiceState() {
@NonNull
@Override
public WebServer getWebServer() {
return service.getWebServer();
}
@Nullable
@Override
public OnBotJavaHelper getOnBotJavaHelper() {
return service.getOnBotJavaHelper();
}
@Override
public EventLoopManager getEventLoopManager() {
return service.getRobot().eventLoopManager;
}
});
AnnotatedHooksClassFilter.getInstance().callWebHandlerRegistrarMethods(this,
service.getWebServer().getWebHandlerManager());
}
private void updateUIAndRequestRobotSetup() {
if (controllerService != null) {
callback.networkConnectionUpdate(controllerService.getNetworkConnectionStatus());
callback.updateRobotStatus(controllerService.getRobotStatus());
// Only show this first-time toast on headless systems: what we have now on non-headless suffices
requestRobotSetup(LynxConstants.isRevControlHub()
? new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRobotSetupComplete);
}
}
: null);
}
}
private void requestRobotSetup(@Nullable Runnable runOnComplete) {
if (controllerService == null) return;
RobotConfigFile file = cfgFileMgr.getActiveConfigAndUpdateUI();
HardwareFactory hardwareFactory = new HardwareFactory(context);
try {
hardwareFactory.setXmlPullParser(file.getXml());
} catch (FileNotFoundException | XmlPullParserException e) {
RobotLog.ww(TAG, e, "Unable to set configuration file %s. Falling back on noConfig.", file.getName());
file = RobotConfigFile.noConfig(cfgFileMgr);
try {
hardwareFactory.setXmlPullParser(file.getXml());
cfgFileMgr.setActiveConfigAndUpdateUI(false, file);
} catch (FileNotFoundException | XmlPullParserException e1) {
RobotLog.ee(TAG, e1, "Failed to fall back on noConfig");
}
}
OpModeRegister userOpModeRegister = createOpModeRegister();
eventLoop = new FtcEventLoop(hardwareFactory, userOpModeRegister, callback, this);
FtcEventLoopIdle idleLoop = new FtcEventLoopIdle(hardwareFactory, userOpModeRegister, callback, this);
controllerService.setCallback(callback);
controllerService.setupRobot(eventLoop, idleLoop, runOnComplete);
passReceivedUsbAttachmentsToEventLoop();
AndroidBoard.showErrorIfUnknownControlHub();
AnnotatedHooksClassFilter.getInstance().callOnCreateEventLoopMethods(this, eventLoop);
}
protected OpModeRegister createOpModeRegister() {
return new FtcOpModeRegister();
}
private void shutdownRobot() {
if (controllerService != null) controllerService.shutdownRobot();
}
private void requestRobotRestart() {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(R.string.toastRestartingRobot));
//
RobotLog.clearGlobalErrorMsg();
RobotLog.clearGlobalWarningMsg();
shutdownRobot();
requestRobotSetup(new Runnable() {
@Override public void run() {
showRestartRobotCompleteToast(R.string.toastRestartRobotComplete);
}
});
}
private void showRestartRobotCompleteToast(@StringRes int resid) {
AppUtil.getInstance().showToast(UILocation.BOTH, AppUtil.getDefContext().getString(resid));
}
private void checkPreferredChannel() {
    // For a Wi-Fi Direct (P2P) network, check to see what the preferred channel is.
if (networkType == NetworkType.WIFIDIRECT) {
int prefChannel = preferencesHelper.readInt(getString(com.qualcomm.ftccommon.R.string.pref_wifip2p_channel), -1);
if (prefChannel == -1) {
prefChannel = 0;
RobotLog.vv(TAG, "pref_wifip2p_channel: No preferred channel defined. Will use a default value of %d", prefChannel);
} else {
RobotLog.vv(TAG, "pref_wifip2p_channel: Found existing preferred channel (%d).", prefChannel);
}
// attempt to set the preferred channel.
RobotLog.vv(TAG, "pref_wifip2p_channel: attempting to set preferred channel...");
wifiDirectChannelChanger = new WifiDirectChannelChanger();
wifiDirectChannelChanger.changeToChannel(prefChannel);
}
}
protected void hittingMenuButtonBrightensScreen() {
ActionBar actionBar = getActionBar();
if (actionBar != null) {
actionBar.addOnMenuVisibilityListener(new ActionBar.OnMenuVisibilityListener() {
@Override
public void onMenuVisibilityChanged(boolean isVisible) {
if (isVisible) {
dimmer.handleDimTimer();
}
}
});
}
}
protected class SharedPreferencesListener implements SharedPreferences.OnSharedPreferenceChangeListener {
@Override public void onSharedPreferenceChanged(SharedPreferences sharedPreferences, String key) {
if (key.equals(context.getString(R.string.pref_app_theme))) {
ThemedActivity.restartForAppThemeChange(getTag(), getString(R.string.appThemeChangeRestartNotifyRC));
} else if (key.equals(context.getString(R.string.pref_wifi_automute))) {
if (preferencesHelper.readBoolean(context.getString(R.string.pref_wifi_automute), false)) {
initWifiMute(true);
} else {
initWifiMute(false);
}
}
}
}
protected void initWifiMute(boolean enable) {
if (enable) {
wifiMuteStateMachine = new WifiMuteStateMachine();
wifiMuteStateMachine.initialize();
wifiMuteStateMachine.start();
motionDetection = new MotionDetection(2.0, 10);
motionDetection.startListening();
motionDetection.registerListener(new MotionDetection.MotionDetectionListener() {
@Override
public void onMotionDetected(double vector)
{
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
});
} else {
wifiMuteStateMachine.stop();
wifiMuteStateMachine = null;
motionDetection.stopListening();
motionDetection.purgeListeners();
motionDetection = null;
}
}
@Override
public void onUserInteraction() {
if (wifiMuteStateMachine != null) {
wifiMuteStateMachine.consumeEvent(WifiMuteEvent.USER_ACTIVITY);
}
}
}

View File

@ -0,0 +1,91 @@
/*
* Copyright (c) 2018 Craig MacFarlane
*
* All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification, are permitted
* (subject to the limitations in the disclaimer below) provided that the following conditions are
* met:
*
* Redistributions of source code must retain the above copyright notice, this list of conditions
* and the following disclaimer.
*
* Redistributions in binary form must reproduce the above copyright notice, this list of conditions
* and the following disclaimer in the documentation and/or other materials provided with the
* distribution.
*
* Neither the name of Craig MacFarlane nor the names of its contributors may be used to
* endorse or promote products derived from this software without specific prior written permission.
*
* NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS LICENSE. THIS
* SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED
* WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
* FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE
* LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
* (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
* CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF
* THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*/
package org.firstinspires.ftc.robotcontroller.internal;
import android.Manifest;
import android.os.Bundle;
import com.qualcomm.ftcrobotcontroller.R;
import org.firstinspires.ftc.robotcore.internal.system.Misc;
import org.firstinspires.ftc.robotcore.internal.system.PermissionValidatorActivity;
import java.util.ArrayList;
import java.util.List;
public class PermissionValidatorWrapper extends PermissionValidatorActivity {
private final String TAG = "PermissionValidatorWrapper";
/*
* The list of dangerous permissions the robot controller needs.
*/
protected List<String> robotControllerPermissions = new ArrayList<String>() {{
add(Manifest.permission.WRITE_EXTERNAL_STORAGE);
add(Manifest.permission.READ_EXTERNAL_STORAGE);
add(Manifest.permission.CAMERA);
add(Manifest.permission.ACCESS_COARSE_LOCATION);
add(Manifest.permission.ACCESS_FINE_LOCATION);
add(Manifest.permission.READ_PHONE_STATE);
}};
private final static Class startApplication = FtcRobotControllerActivity.class;
public String mapPermissionToExplanation(final String permission) {
if (permission.equals(Manifest.permission.WRITE_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcWriteExternalStorageExplain);
} else if (permission.equals(Manifest.permission.READ_EXTERNAL_STORAGE)) {
return Misc.formatForUser(R.string.permRcReadExternalStorageExplain);
} else if (permission.equals(Manifest.permission.CAMERA)) {
return Misc.formatForUser(R.string.permRcCameraExplain);
} else if (permission.equals(Manifest.permission.ACCESS_COARSE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.ACCESS_FINE_LOCATION)) {
return Misc.formatForUser(R.string.permAccessLocationExplain);
} else if (permission.equals(Manifest.permission.READ_PHONE_STATE)) {
return Misc.formatForUser(R.string.permReadPhoneState);
}
return Misc.formatForUser(R.string.permGenericExplain);
}
@Override
protected void onCreate(Bundle savedInstanceState)
{
super.onCreate(savedInstanceState);
permissions = robotControllerPermissions;
}
protected Class onStartApplication()
{
FtcRobotControllerActivity.setPermissionsValidated();
return startApplication;
}
}

Binary file not shown.


Width:  |  Height:  |  Size: 975 B

Binary file not shown.


Width:  |  Height:  |  Size: 4.7 KiB

View File

@ -0,0 +1,191 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:style="http://schemas.android.com/apk/res-auto"
tools:context="org.firstinspires.ftc.robotcontroller.internal.FtcRobotControllerActivity"
android:focusable="true"
android:id="@+id/entire_screen"
android:orientation="vertical">
<!-- black bar on top -->
<RelativeLayout
android:id="@+id/top_bar"
android:layout_width="fill_parent"
android:layout_height="80dp"
android:background="@color/background_black">
<ImageView
android:id="@+id/robotIcon"
android:src="@drawable/icon_robotcontroller"
android:layout_width="wrap_content"
android:layout_height="fill_parent"
android:adjustViewBounds="true"
android:layout_margin="1dp"/>
<TextView
android:id="@+id/textDeviceName"
android:layout_toEndOf="@id/robotIcon"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="?attr/textWhite"
android:textAppearance="?android:attr/textAppearanceMedium"
android:padding="8dp"
android:textStyle="bold"/>
<ImageButton
android:id="@+id/menu_buttons"
android:scaleType="fitXY"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:src="@drawable/icon_menu"
android:background="@android:color/transparent"
android:padding="15dp"
android:adjustViewBounds="true"
android:layout_alignParentEnd="true"
android:layout_centerInParent="true"
android:layout_centerHorizontal="true"
android:layout_margin="10dp"/>
</RelativeLayout>
<!-- end of black bar -->
<include layout="@layout/header"
android:id="@+id/included_header"/>
<RelativeLayout
android:background="@color/background_very_light_grey"
android:id="@+id/RelativeLayout"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:keepScreenOn="true"
android:paddingBottom="@dimen/activity_vertical_margin"
android:paddingLeft="@dimen/activity_horizontal_margin"
android:paddingRight="@dimen/activity_horizontal_margin"
android:paddingTop="@dimen/activity_vertical_margin" >
<TextView
android:id="@+id/textNetworkConnectionStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textRobotStatus"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textNetworkConnectionStatus"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textOpMode"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textRobotStatus"
android:visibility="invisible"
android:text="" />
<FrameLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_below="@+id/textOpMode"
android:layout_above="@+id/textGamepad1">
<LinearLayout
android:id="@+id/monitorContainer"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical">
<LinearLayout
android:id="@+id/cameraMonitorViewId"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
android:orientation="vertical"
/>
<FrameLayout
android:id="@+id/tfodMonitorViewId"
android:visibility="gone"
android:layout_width="match_parent"
android:layout_height="0dp"
android:layout_weight="1"
/>
</LinearLayout>
<!-- When visible, the error message will overlay and thus partially
obscure part of the camera monitor. We make this trade off so as to
allow for a bigger camera monitor view in the common case when the
error is not in fact present
-->
<TextView style="@style/FtcTextViewStyleBold"
android:id="@+id/textErrorMessage"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:text=""
android:visibility="invisible"
android:textColor="?attr/textMediumDark" />
</FrameLayout>
<TextView
android:id="@+id/textGamepad1"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_above="@+id/textGamepad2"
android:visibility="invisible"
android:text="" />
<TextView
android:id="@+id/textGamepad2"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:visibility="invisible"
android:text="" />
</RelativeLayout>
<WebView
android:id="@+id/webViewBlocksRuntime"
android:layout_width="1dp"
android:layout_height="1dp" />
</LinearLayout>

View File

@ -0,0 +1,77 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<menu xmlns:android="http://schemas.android.com/apk/res/android" >
<item
android:id="@+id/action_settings"
android:orderInCategory="100"
android:showAsAction="never"
android:title="@string/settings_menu_item"/>
<item
android:id="@+id/action_restart_robot"
android:orderInCategory="200"
android:showAsAction="never"
android:title="@string/restart_robot_menu_item"/>
<item
android:id="@+id/action_configure_robot"
android:orderInCategory="300"
android:showAsAction="never"
android:title="@string/configure_robot_menu_item"/>
<item
android:id="@+id/action_program_and_manage"
android:orderInCategory="550"
android:showAsAction="never"
android:title="@string/program_and_manage_menu_item"/>
<item
android:id="@+id/action_inspection_mode"
android:orderInCategory="600"
android:showAsAction="never"
android:title="@string/inspection_mode_menu_item"/>
<item
android:id="@+id/action_about"
android:orderInCategory="999"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/about_menu_item"/>
<item
android:id="@+id/action_exit_app"
android:orderInCategory="1000"
android:icon="@android:drawable/ic_menu_info_details"
android:title="@string/exit_menu_item"/>
</menu>

Binary file not shown.

Binary file not shown.

View File

@ -0,0 +1,40 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<!-- Default screen margins, per the Android Design guidelines. -->
<dimen name="activity_horizontal_margin">16dp</dimen>
<dimen name="activity_vertical_margin">5dp</dimen>
</resources>

View File

@ -0,0 +1,72 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
strings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<resources>
<string name="app_name">FTC Robot Controller</string>
<!-- Menu -->
<string name="inspection_mode_menu_item">Self Inspect</string>
<string name="program_and_manage_menu_item">Program &amp; Manage</string>
<string name="blocks_menu_item">Blocks</string>
<string name="settings_menu_item">Settings</string>
<string name="restart_robot_menu_item">Restart Robot</string>
<string name="configure_robot_menu_item">Configure Robot</string>
<string name="about_menu_item">About</string>
<string name="exit_menu_item">Exit</string>
<!-- Toast messages -->
<string name="toastWifiConfigurationComplete">Configuration Complete</string>
<string name="toastRestartingRobot">Restarting Robot</string>
<string name="toastWifiUpBeforeProgrammingMode">The Robot Controller must be fully up and running before entering Program and Manage Mode.</string>
<!-- for interpreting pref_app_theme contents. may be override in merged resources -->
<integer-array name="app_theme_ids">
<item>@style/AppThemeRedRC</item>
<item>@style/AppThemeGreenRC</item>
<item>@style/AppThemeBlueRC</item>
<item>@style/AppThemePurpleRC</item>
<item>@style/AppThemeOrangeRC</item>
<item>@style/AppThemeTealRC</item>
</integer-array>
<string name="pref_ftc_season_year_of_current_rc">pref_ftc_season_year_of_current_rc</string>
<string name="packageName">@string/packageNameRobotController</string>
</resources>

View File

@ -0,0 +1,23 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- This is a placeholder for adding future style-specific content in the robot controller only -->
<resources>
<style name="AppThemeRedRC" parent="AppTheme.Red">
</style>
<style name="AppThemeGreenRC" parent="AppTheme.Green">
</style>
<style name="AppThemeBlueRC" parent="AppTheme.Blue">
</style>
<style name="AppThemePurpleRC" parent="AppTheme.Purple">
</style>
<style name="AppThemeOrangeRC" parent="AppTheme.Orange">
</style>
<style name="AppThemeTealRC" parent="AppTheme.Teal">
</style>
</resources>

View File

@ -0,0 +1,93 @@
<?xml version="1.0" encoding="utf-8"?>
<!--
app_settings.xml in FtcRobotController
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
See https://developer.android.com/guide/topics/ui/settings.html
-->
<PreferenceScreen xmlns:tools="http://schemas.android.com/tools"
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
tools:context=".FtcRobotControllerActivity">
<PreferenceCategory
android:title="@string/prefcat_configure_robot">
<EditTextPreference
android:title="@string/prefedit_device_name_rc"
android:summary="@string/prefedit_device_name_summary_rc"
android:key="@string/pref_device_name"
android:defaultValue=""
/>
<org.firstinspires.ftc.robotcore.internal.ui.ColorListPreference
android:title="@string/prefedit_app_theme_rc"
android:summary="@string/prefedit_app_theme_summary_rc"
android:key="@string/pref_app_theme"
android:entries="@array/app_theme_names"
android:entryValues="@array/app_theme_tokens"
app:colors="@array/app_theme_colors"
/>
<SwitchPreference
android:title="@string/prefedit_sound_on_off"
android:summary="@string/prefedit_sound_on_off_summary_rc"
android:key="@string/pref_sound_on_off"
android:defaultValue="true"
/>
<PreferenceScreen
android:title="@string/titleAdvancedRCSettings"
android:summary="@string/summaryAdvancedRCSettings"
android:key="@string/pref_launch_advanced_rc_settings">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.FtcAdvancedRCSettingsActivity"
/>
</PreferenceScreen>
<PreferenceScreen
android:title="@string/prefedit_view_logs"
android:summary="@string/prefedit_view_logs_summary"
android:key="@string/pref_launch_viewlogs">
<intent
android:targetPackage="@string/packageName"
android:targetClass="com.qualcomm.ftccommon.ViewLogsActivity" />
</PreferenceScreen>
</PreferenceCategory>
</PreferenceScreen>

View File

@ -0,0 +1,50 @@
<!--
Copyright (c) 2014, 2015 Qualcomm Technologies Inc
All rights reserved.
Redistribution and use in source and binary forms, with or without modification,
are permitted (subject to the limitations in the disclaimer below) provided that
the following conditions are met:
Redistributions of source code must retain the above copyright notice, this list
of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of Qualcomm Technologies Inc nor the names of its contributors
may be used to endorse or promote products derived from this software without
specific prior written permission.
NO EXPRESS OR IMPLIED LICENSES TO ANY PARTY'S PATENT RIGHTS ARE GRANTED BY THIS
LICENSE. THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE
ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
-->
<!--
https://developer.android.com/guide/topics/connectivity/usb/host
-->
<!-- see also RobotUsbDevice.getUsbIdentifiers() -->
<resources>
<usb-device vendor-id="1027" product-id="24577" /> <!-- FT232 Modern Robotics: 0x0403/0x6001 -->
<usb-device vendor-id="1027" product-id="24597" /> <!-- FT232 Lynx: 0x0403/0x6015 -->
<!-- cameras -->
<!-- We don't currently auto-connect to UVC cameras, instead relying
on the app itself to poll. But we could change that if we wished -->
<!-- Update: turns out we need that if we are to get onNewIntent() notifications
in our activity when cameras attach. See FtcRobotControllerActivity. -->
<usb-device class="14" subclass="2"/>
</resources>

1132
README.md Normal file

File diff suppressed because it is too large

21
TeamCode/build.gradle Normal file
View File

@ -0,0 +1,21 @@
//
// build.gradle in TeamCode
//
// Most of the definitions for building your module reside in a common, shared
// file 'build.common.gradle'. Being factored in this way makes it easier to
// integrate updates to the FTC into your code. If you really need to customize
// the build definitions, you can place those customizations in this file, but
// please think carefully as to whether such customizations are really necessary
// before doing so.
// Custom definitions may go here
// Include common definitions from above.
apply from: '../build.common.gradle'
apply from: '../build.dependencies.gradle'
dependencies {
implementation project(':FtcRobotController')
annotationProcessor files('lib/OpModeAnnotationProcessor.jar')
}

Binary file not shown.

View File

@ -0,0 +1,12 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Note: the actual manifest file used in your APK merges this file with contributions
from the modules on which your app depends (such as FtcRobotController, etc).
So it won't ultimately be as empty as it might here appear to be :-) -->
<!-- The package name here determines the package for your R class and your BuildConfig class -->
<manifest
package="org.firstinspires.ftc.teamcode"
xmlns:android="http://schemas.android.com/apk/res/android">
<application/>
</manifest>

View File

@ -0,0 +1,55 @@
package org.firstinspires.ftc.teamcode;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.DigitalChannel;
import com.qualcomm.robotcore.hardware.DistanceSensor;
import com.qualcomm.robotcore.hardware.Gyroscope;
import com.qualcomm.robotcore.hardware.Servo;
@TeleOp
public class MyFIRSTJavaOpMode extends LinearOpMode {
private Gyroscope imu;
private DcMotor frontRight;
private DcMotor frontLeft;
private DcMotor rearRight;
private DcMotor rearLeft;
private DigitalChannel digitalTouch;
private DistanceSensor sensorColorRange;
private Servo servoTest;
@Override
public void runOpMode() {
//imu = hardwareMap.get(Gyroscope.class, "imu");
frontRight = hardwareMap.get(DcMotor.class, "frontRight");
frontLeft = hardwareMap.get(DcMotor.class, "frontLeft");
rearRight = hardwareMap.get(DcMotor.class, "rearRight");
rearLeft = hardwareMap.get(DcMotor.class, "rearLeft");
//digitalTouch = hardwareMap.get(DigitalChannel.class, "digitalTouch");
//sensorColorRange = hardwareMap.get(DistanceSensor.class, "sensorColorRange");
//servoTest = hardwareMap.get(Servo.class, "servoTest");
        // Drive all four motors forward at half power (note: this runs as soon as INIT is
        // pressed, before waitForStart() below)
        frontRight.setPower(0.5);
        frontLeft.setPower(0.5);
        rearLeft.setPower(0.5);
        rearRight.setPower(0.5);
telemetry.addData("Status", "Initialized");
telemetry.update();
// Wait for the game to start (driver presses PLAY)
waitForStart();
// run until the end of the match (driver presses STOP)
while (opModeIsActive()) {
telemetry.addData("Status", "Running");
telemetry.update();
}
}
}

View File

@ -0,0 +1,47 @@
package org.firstinspires.ftc.teamcode;
import com.qualcomm.robotcore.eventloop.opmode.OpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
@TeleOp
public class controllerOpMode extends OpMode {
    private DcMotor frontRight, frontLeft, rearRight, rearLeft;
    private double drive, strafe;      // left stick y (forward/back) and x (side-to-side) inputs
    private double driveMult = 0.25;   // overall speed multiplier, adjusted with the dpad
public void init(){
frontRight = hardwareMap.get(DcMotor.class, "frontRight");
frontLeft = hardwareMap.get(DcMotor.class, "frontLeft");
rearRight = hardwareMap.get(DcMotor.class, "rearRight");
rearLeft = hardwareMap.get(DcMotor.class, "rearLeft");
telemetry.addData("Status", "Initialized");
telemetry.update();
}
    public void loop() {
        // Read the driver's left stick: x controls strafe, y controls drive
        strafe = gamepad1.left_stick_x;
        drive = gamepad1.left_stick_y;
        // Dpad up/down slowly nudges the speed multiplier, kept roughly within [0, 1]
        if (gamepad1.dpad_up) {
            if (driveMult <= 1) {
                driveMult += 0.0005;
            }
        }
        if (gamepad1.dpad_down) {
            if (driveMult >= 0) {
                driveMult -= 0.0005;
            }
        }
telemetry.addData("Speed", "Current Speed = " + Math.round(driveMult*100));
telemetry.update();
frontRight.setPower(driveMult*(drive - strafe));
frontLeft.setPower(-1*driveMult*(drive + strafe));
rearRight.setPower(driveMult*(drive + strafe));
rearLeft.setPower(-1*driveMult*(drive - strafe));
}
}

View File

@ -0,0 +1,121 @@
## TeamCode Module
Welcome!
This module, TeamCode, is the place where you will write/paste the code for your team's
robot controller App. This module is currently empty (a clean slate) but the
process for adding OpModes is straightforward.
## Creating your own OpModes
The easiest way to create your own OpMode is to copy a Sample OpMode and make it your own.
Sample OpModes exist in the FtcRobotController module.
To locate these samples, find the FtcRobotController module in the "Project/Android" tab.
Expand the following tree elements:
FtcRobotController / java / org.firstinspires.ftc.robotcontroller / external / samples
A range of different sample classes can be seen in this folder.
The class names follow a naming convention which indicates the purpose of each class.
The full description of this convention is found in the samples/sample_convention.md file.
A brief synopsis of the naming convention is given here:
The prefix of the name will be one of the following:
* Basic: This is a minimally functional OpMode used to illustrate the skeleton/structure
of a particular style of OpMode. These are bare bones examples.
* Sensor: This is a Sample OpMode that shows how to use a specific sensor.
It is not intended to drive a functioning robot; it simply shows the minimal code
required to read and display the sensor values.
* Hardware: This is not an actual OpMode, but a helper class that is used to describe
one particular robot's hardware devices, e.g., for a Pushbot. Look at any
Pushbot sample to see how this can be used in an OpMode.
Teams can copy one of these to create their own robot definition.
* Pushbot: This is a Sample OpMode that uses the Pushbot robot structure as a base.
* Concept: This is a sample OpMode that illustrates performing a specific function or concept.
These may be complex, but their operation should be explained clearly in the comments,
or the header should reference an external doc, guide or tutorial.
* Library: This is a class, or set of classes used to implement some strategy.
These will typically NOT implement a full OpMode. Instead they will be included
by an OpMode to provide some stand-alone capability.
Once you are familiar with the range of samples available, you can choose one to be the
basis for your own robot. In all cases, the desired sample(s) needs to be copied into
your TeamCode module to be used.
This is done inside Android Studio directly, using the following steps:
1) Locate the desired sample class in the Project/Android tree.
2) Right click on the sample class and select "Copy"
3) Expand the TeamCode / java folder
4) Right click on the org.firstinspires.ftc.teamcode folder and select "Paste"
5) You will be prompted for a class name for the copy.
Choose something meaningful based on the purpose of this class.
Start with a capital letter, and remember that there may be more similar classes later.
Once your copy has been created, you should prepare it for use on your robot.
This is done by adjusting the OpMode's name, and enabling it to be displayed on the
Driver Station's OpMode list.
Each OpMode sample class begins with several lines of code like the ones shown below:
```
@TeleOp(name="Template: Linear OpMode", group="Linear Opmode")
@Disabled
```
The name that will appear on the driver station's "opmode list" is defined by the code:
``name="Template: Linear OpMode"``
You can change what appears between the quotes to better describe your OpMode.
The "group=" portion of the code can be used to help organize your list of OpModes.
As shown, the current OpMode will NOT appear on the driver station's OpMode list because of the
``@Disabled`` annotation which has been included.
This line can simply be deleted, or commented out, to make the OpMode visible.
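For example, a renamed copy that has been enabled (by commenting out ``@Disabled``) might begin
like this; the name and group shown here are just placeholders:
```
@TeleOp(name="MyTeleOp", group="TeamCode")
// @Disabled
```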
## ADVANCED Multi-Team App management: Cloning the TeamCode Module
In some situations, you have multiple teams in your club and you want them to all share
a common code organization, with each being able to *see* the others' code but each having
their own team module with their own code that they maintain themselves.
In this situation, you might wish to clone the TeamCode module, once for each of these teams.
Each of the clones would then appear alongside each other in the Android Studio module list,
together with the FtcRobotController module (and the original TeamCode module).
Individual team phones can then be programmed by selecting the desired module from the pulldown list
prior to clicking the green Run arrow.
Warning: This is not for the inexperienced software developer.
You will need to be comfortable with file manipulation and managing Android Studio modules.
These changes are performed OUTSIDE of Android Studio, so close Android Studio before you do this.
Also: make a full project backup before you start this :)
To clone TeamCode, do the following:
Note: Some names start with "Team" and others start with "team". This is intentional.
1) Using your operating system file management tools, copy the whole "TeamCode"
folder to a sibling folder with a corresponding new name, eg: "Team0417".
2) In the new Team0417 folder, delete the TeamCode.iml file.
3) In the new Team0417 folder, rename the "src/main/java/org/firstinspires/ftc/teamcode" folder
to a matching name with a lowercase 'team' eg: "team0417".
4) In the new Team0417/src/main folder, edit the "AndroidManifest.xml" file, change the line that contains
package="org.firstinspires.ftc.teamcode"
to be
package="org.firstinspires.ftc.team0417"
5) Add include ':Team0417' to the "/settings.gradle" file (see the example after this list).
6) Open Android Studio and clean out any old files by using the menu "Build / Clean Project".
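As an illustration of step 5, the "/settings.gradle" file might then look something like this,
assuming it previously listed only the stock modules:
```
include ':FtcRobotController'
include ':TeamCode'
include ':Team0417'
```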


@ -0,0 +1 @@
Place your sound files in this folder to use them as project resources.


@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<resources>
</resources>


@ -0,0 +1,161 @@
<?xml version='1.0' encoding='UTF-8' standalone='yes' ?>
<!--
This file can provide additional camera calibration settings beyond those built into the SDK itself.
Each calibration is for a particular camera (indicated by USB vid & pid pair) and a particular
capture resolution for the camera. Note: it is very important when capturing images used to calibrate
a camera that the image acquisition tool can actually control this capture resolution within the camera
itself and that you use this setting correctly. Many image acquisition tools do not in fact provide
this level of control.
Beyond simply providing additional, new camera calibrations, calibrations provided herein can
*replace/update* those that are builtin to the SDK. This matching is keyed, of course, by the
(vid, pid, size) triple. Further, if such a calibration has the 'remove' attribute with value 'true',
any existing calibration with that key is removed (and the calibration itself not added).
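For instance, a hypothetical entry that removes a built-in calibration (rather than adding one)
might look like: <Camera vid="Logitech" pid="0x082D"> <Calibration size="640 480" remove="true"/> </Camera>
(the vid and pid shown are simply the example C920 values used later in this file).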
Calibrations are internally processed according to aspect ratio. If a format is requested in a size
that is not calibrated, but a calibration does exist for the same aspect ratio on the same camera,
then the latter will be scaled to accommodate the request. For example, if a 640x480 calibration
is requested but only a 800x600 calibration exists for that camera, then the 800x600 is scaled
down to service the 640x480 request.
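(As a worked example of that scaling: going from 800x600 to 640x480 scales by 640/800 = 0.8, so the
C920's 800x600 focal length of 775.79 listed below would scale to roughly 775.79 * 0.8 = 620.6, close
to its separately calibrated 640x480 value of 622.0.)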
Further, it is important to note that if *no* calibrations exist for a given camera, then Vuforia
is offered the entire range of capture resolutions that the hardware can support (and it does its
best to deal with the lack of calibration). However, if *any* calibrations are provided for a camera,
then capture resolutions in those aspect ratios supported by the camera for which any calibrations
are *not* provided are *not* offered. Thus, if you calibrate a camera but fail to calibrate all
the camera's supported aspect ratios, you limit the choices of capture resolutions that Vuforia can
select from.
One image acquisition program that supports control of camera capture resolution is YouCam 7:
https://www.cyberlink.com/products/youcam/features_en_US.html
Programs that can process acquired images to determine camera calibration settings include:
https://www.3dflow.net/3df-zephyr-free/ (see "Utilities/Images/Launch Camera Calibration" therein)
http://graphics.cs.msu.ru/en/node/909
Note that the type of images that must be acquired in order to calibrate is specific to the
calibration software used.
The required contents are illustrated here by example. Note that for the attribute names, both the
camelCase and the underscore_variations are supported; they are equivalent. The attributes for
each Calibration are as follows:
size (int pair): space separated camera resolution (width, height).
focalLength (float pair): space separated focal length values.
principalPoint (float pair): space separated principal point values (width, height).
distortionCoefficients (an 8-element float array): distortion coefficients in the following form
(r:radial, t:tangential): [r0, r1, t0, t1, r2, r3, r4, r5]
see https://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html
The examples here are commented out as the values are built into the FTC SDK. They serve instead
here as examples of how to make your own.
-->
<Calibrations>
<!-- ======================================================================================= -->
<!-- Microsoft Lifecam HD 3000 v1, Calibrated by PTC using unknown tooling -->
<!-- <Camera vid="Microsoft" pid="0x0779">
<Calibration
size="640 480"
focalLength="678.154f, 678.17f"
principalPoint="318.135f, 228.374f"
distortionCoefficients="0.154576f, -1.19143f, 0f, 0f, 2.06105f, 0f, 0f, 0f"
/>
</Camera> -->
<!-- ======================================================================================= -->
<!-- Microsoft Lifecam HD 3000 v2, Calibrated by PTC using unknown tooling -->
<!-- <Camera vid="Microsoft" pid="0x0810">
<Calibration
size="640 480"
focalLength="678.154f, 678.17f"
principalPoint="318.135f, 228.374f"
distortionCoefficients="0.154576f, -1.19143f, 0f, 0f, 2.06105f, 0f, 0f, 0f"
/>
</Camera> -->
<!-- ======================================================================================= -->
<!-- Logitech HD Webcam C310, Calibrated by Robert Atkinson, 2018.05.30 using 3DF Zephyr -->
<!-- <Camera vid="Logitech" pid="0x081B">
<Calibration
size="640 480"
focalLength="821.993f, 821.993f"
principalPoint="330.489f, 248.997f"
distortionCoefficients="-0.018522, 1.03979, 0, 0, -3.3171, 0, 0, 0"
/>
<Calibration
size="640 360"
focalLength="715.307f, 715.307f"
principalPoint="319.759f, 188.917f"
distortionCoefficients="-0.0258948, 1.06258, 0, 0, -3.40245, 0, 0, 0"
/>
</Camera> -->
<!-- ======================================================================================= -->
<!-- Logitech HD Pro Webcam C920, Calibrated by Robert Atkinson, 2018.05.30 using 3DF Zephyr -->
<!-- <Camera vid="Logitech" pid="0x082D">
<Calibration
size="640 480"
focalLength="622.001f, 622.001f"
principalPoint="319.803f, 241.251f"
distortionCoefficients="0.1208, -0.261599, 0, 0, 0.10308, 0, 0, 0"
/>
<Calibration
size="800 600"
focalLength="775.79f, 775.79f"
principalPoint="400.898f, 300.79f"
distortionCoefficients="0.112507, -0.272067, 0, 0, 0.15775, 0, 0, 0"
/>
<Calibration
size="640 360"
focalLength="463.566f, 463.566f"
principalPoint="316.402f, 176.412f"
distortionCoefficients="0.111626 , -0.255626, 0, 0, 0.107992, 0, 0, 0"
/>
<Calibration
size="1920, 1080"
focalLength="1385.92f , 1385.92f"
principalPoint="951.982f , 534.084f"
distortionCoefficients="0.117627, -0.248549, 0, 0, 0.107441, 0, 0, 0"
/>
<Calibration
size="800, 448"
focalLength="578.272f , 578.272f"
principalPoint="402.145f , 221.506f"
distortionCoefficients="0.12175, -0.251652 , 0, 0, 0.112142, 0, 0, 0"
/>
<Calibration
size="864, 480"
focalLength="626.909f , 626.909f"
principalPoint="426.007f , 236.834f"
distortionCoefficients="0.120988, -0.253336 , 0, 0, 0.102445, 0, 0, 0"
/>
</Camera> -->
<!-- ======================================================================================= -->
<!-- Logitech HD Webcam C270, Calibrated by Noah Andrews, 2019.03.13 using 3DF Zephyr -->
<!--<Camera vid="Logitech" pid="0x0825">
<Calibration
size="640 480"
focalLength="822.317f, 822.317f"
principalPoint="319.495f, 242.502f"
distortionCoefficients="-0.0449369, 1.17277, 0, 0, -3.63244, 0, 0, 0"
/>
</Camera> -->
<!-- ======================================================================================= -->
</Calibrations>

build.common.gradle Normal file
@ -0,0 +1,130 @@
/**
* build.common.gradle
*
* Try to avoid editing this file, as it may be updated from time to time as the FTC SDK
* evolves. Rather, if it is necessary to customize the build process, do those edits in
* the build.gradle file in TeamCode.
*
* This file contains the necessary content of the 'build.gradle' files for robot controller
* applications built using the FTC SDK. Each individual 'build.gradle' in those applications
* can simply contain the one line:
*
* apply from: '../build.common.gradle'
*
* which will pick up this file here. This approach makes it easier to integrate
* updates to the FTC SDK into your code.
*/
import java.util.regex.Pattern
apply plugin: 'com.android.application'
android {
compileSdkVersion 29
signingConfigs {
release {
def apkStoreFile = System.getenv("APK_SIGNING_STORE_FILE")
if (apkStoreFile != null) {
keyAlias System.getenv("APK_SIGNING_KEY_ALIAS")
keyPassword System.getenv("APK_SIGNING_KEY_PASSWORD")
storeFile file(System.getenv("APK_SIGNING_STORE_FILE"))
storePassword System.getenv("APK_SIGNING_STORE_PASSWORD")
} else {
keyAlias 'androiddebugkey'
keyPassword 'android'
storeFile rootProject.file('libs/ftc.debug.keystore')
storePassword 'android'
}
}
debug {
keyAlias 'androiddebugkey'
keyPassword 'android'
storeFile rootProject.file('libs/ftc.debug.keystore')
storePassword 'android'
}
}
aaptOptions {
noCompress "tflite"
}
defaultConfig {
signingConfig signingConfigs.debug
applicationId 'com.qualcomm.ftcrobotcontroller'
minSdkVersion 23
//noinspection ExpiredTargetSdkVersion
targetSdkVersion 28
/**
* We keep the versionCode and versionName of robot controller applications in sync with
* the master information published in the AndroidManifest.xml file of the FtcRobotController
* module. This helps avoid confusion that might arise from having multiple versions of
* a robot controller app simultaneously installed on a robot controller device.
*
* We accomplish this with the help of a funky little Groovy script that maintains that
* correspondence automatically.
*
* @see <a href="http://developer.android.com/tools/building/configuring-gradle.html">Configure Your Build</a>
* @see <a href="http://developer.android.com/tools/publishing/versioning.html">Versioning Your App</a>
*/
def manifestFile = project(':FtcRobotController').file('src/main/AndroidManifest.xml');
def manifestText = manifestFile.getText()
//
def vCodePattern = Pattern.compile("versionCode=\"(\\d+(\\.\\d+)*)\"")
def matcher = vCodePattern.matcher(manifestText)
matcher.find()
def vCode = Integer.parseInt(matcher.group(1))
//
def vNamePattern = Pattern.compile("versionName=\"(.*)\"")
matcher = vNamePattern.matcher(manifestText);
matcher.find()
def vName = matcher.group(1)
//
versionCode vCode
versionName vName
}
// Advanced user code might just want to use Vuforia directly, so we set up the libs as needed
// http://google.github.io/android-gradle-dsl/current/com.android.build.gradle.internal.dsl.BuildType.html
buildTypes {
release {
signingConfig signingConfigs.release
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
debug {
debuggable true
jniDebuggable true
renderscriptDebuggable true
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
}
compileOptions {
sourceCompatibility JavaVersion.VERSION_1_8
targetCompatibility JavaVersion.VERSION_1_8
}
packagingOptions {
pickFirst '**/*.so'
}
sourceSets.main {
jni.srcDirs = []
jniLibs.srcDir rootProject.file('libs')
}
ndkVersion '21.3.6528147'
}
repositories {
flatDir {
dirs rootProject.file('libs')
}
}

build.dependencies.gradle Normal file
@ -0,0 +1,23 @@
repositories {
mavenCentral()
google() // Needed for androidx
jcenter() // Needed for tensorflow-lite
flatDir {
dirs rootProject.file('libs')
}
}
dependencies {
implementation 'org.firstinspires.ftc:Inspection:7.0.0'
implementation 'org.firstinspires.ftc:Blocks:7.0.0'
implementation 'org.firstinspires.ftc:Tfod:7.0.0'
implementation 'org.firstinspires.ftc:RobotCore:7.0.0'
implementation 'org.firstinspires.ftc:RobotServer:7.0.0'
implementation 'org.firstinspires.ftc:OnBotJava:7.0.0'
implementation 'org.firstinspires.ftc:Hardware:7.0.0'
implementation 'org.firstinspires.ftc:FtcCommon:7.0.0'
implementation 'org.tensorflow:tensorflow-lite-task-vision:0.2.0'
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'org.firstinspires.ftc:gameAssets-FreightFrenzy:1.0.0'
}

build.gradle Normal file
@ -0,0 +1,69 @@
/**
* Top-level build file for ftc_app project.
*
* It is extraordinarily rare that you will ever need to edit this file.
*/
configurations {
doc { transitive false }
}
buildscript {
repositories {
mavenCentral()
google()
jcenter()
}
dependencies {
classpath 'com.android.tools.build:gradle:4.0.1'
}
}
// This is now required because aapt2 has to be downloaded from the
// google() repository beginning with version 3.2 of the Android Gradle Plugin
allprojects {
repositories {
mavenCentral()
google()
jcenter()
}
}
repositories {
mavenCentral()
flatDir {
dirs '../libs'
}
}
dependencies {
doc 'org.firstinspires.ftc:Hardware:6.2.0'
doc 'org.firstinspires.ftc:RobotCore:6.2.0'
doc 'org.firstinspires.ftc:FtcCommon:6.2.0'
doc 'org.firstinspires.ftc:OnBotJava:6.2.0'
doc 'org.firstinspires.ftc:Inspection:6.2.0'
}
task extractJavadoc {
doLast {
def componentIds = configurations.doc.incoming.resolutionResult.allDependencies.collect { it.selected.id }
def result = dependencies.createArtifactResolutionQuery()
.forComponents(componentIds)
.withArtifacts(JvmLibrary, SourcesArtifact, JavadocArtifact)
.execute()
for (component in result.resolvedComponents) {
component.getArtifacts(JavadocArtifact).each { artifact ->
def version = artifact.identifier.componentIdentifier.version
def libName = artifact.identifier.componentIdentifier.moduleIdentifier.name
copy {
from zipTree(artifact.file)
into "docs/$version/$libName/"
}
}
}
}
}
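// To generate these docs, run the task above from the project root:
//     ./gradlew extractJavadoc
// The resolved javadoc jars are unpacked into docs/<version>/<libName>/.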


@ -0,0 +1,21 @@
The sound files listed below in this SDK were purchased from www.audioblocks.com under the
following license.
http://support.audioblocks.com/customer/en/portal/topics/610636-licensing-faq-s/articles
How am I allowed to use your content?
Last Updated: Aug 11, 2016 01:51PM EDT
Our content may be used for nearly any project, commercial or otherwise, including feature
films, broadcast, commercial, industrial, educational video, print projects, multimedia,
games, and the internet, so long as substantial value is added to the content. (For example,
incorporating an audio clip into a commercial qualifies, while reposting our audio clip on
YouTube with no modification or no visual component does not.) Once you download a file it is
yours to keep and use forever, royalty-free, even if you change your subscription or cancel
your account.
List of applicable sound files
chimeconnect.wav
chimedisconnect.wav
errormessage.wav
warningmessage.wav


@ -0,0 +1,15 @@
EXHIBIT A - LEGO® Open Source License Agreement
The contents of the file 'nxtstartupsound.wav' contained in this SDK are subject to the
LEGO® Open Source License Agreement Version 1.0 (the "License"); you may not use this
file except in compliance with the License. You may obtain a copy of the License
at "LEGO Open Source License.pdf" contained in the same directory as this exhibit.
Software distributed under the License is distributed on an "AS IS" basis, WITHOUT
WARRANTY OF ANY KIND, either express or implied. See the License for the specific
language governing rights and limitations under the License.
The Original Code is <firmwareRoot>\AT91SAM7S256\Resource\SOUNDS\!Startup.rso.
LEGO is the owner of the Original Code. Portions created by Robert Atkinson are
Copyright (C) 2015. All Rights Reserved.
Contributor(s): Robert Atkinson.

Binary file not shown.

doc/media/PullRequest.PNG Normal file (binary image, 19 KiB; not shown)

Some files were not shown because too many files have changed in this diff.